Youth Olympics: Major IT milestones to date 'met'

IT plans for the inaugural Youth Olympic Games (YOG) to be held here this year have made progress and are on track with scheduled deadlines, according to Atos Origin, the company responsible for building and operating IT infrastructure and systems at the Games.
The team has, to date, met the major milestones outlined in the project, Yan Noblot, program manager at Atos Origin, said in an interview with ZDNet Asia. The 34-year-old is the YOG chief integrator and has been involved in several Olympic Games, including Athens 2004, the Torino 2006 Winter Games and the initial phases of Beijing 2008 and London 2012.
A veteran Worldwide Olympic Partner, Atos Origin was appointed the overall IT lead by the Singapore Youth Olympic Games Organising Committee (SYOGOC) in December 2008 and is also responsible for building the Games Management System (GMS) and the Information Diffusion System (IDS). It currently has 34 employees dedicated to the YOG, with manpower ramping up closer to the event.
The IT services vendor, Noblot said, has since put into "production" several modules under the GMS to support preparation for the Games. It also launched in December 2009 an integration lab to work more closely with other participating sponsors. YOG technology systems are now hosted in a data center in the eastern part of the island-state.
In the near term, disaster recovery plans that include the identification of a redundant data center will be finalized, he added. The Technology Operations Center (TOC), which serves as the IT command center during the YOG, will also be ready. The Games will run from Aug. 14 to 26.
Compared to Beijing 2008's TOC, the Singapore center will be smaller, Noblot said.
Tapping social media
But while the YOG is generally acknowledged to be smaller in scale than the summer and winter Olympic events, it has its own unique characteristics.
Due to the lower median age of YOG participants, Noblot noted, there is significant emphasis on new media such as social networking and mobile applications. "Atos Origin as the primary integrator not only has to provide real-time information to those new applications, but we also have to support them," he said.
One example is the "Digital Concierge for Singapore 2010" app, which will be pre-loaded on Samsung's Omnia Lite B7300 smartphones for athletes and selected team officials to use during the Games. As a result of enhancements to the IDS, real-time schedule updates and results will be pushed to the phone via the Digital Concierge app, he said.
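The push model described above can be sketched as a minimal publish/subscribe loop. The class and field names below are purely illustrative, not Atos Origin's actual IDS or Digital Concierge interfaces.

```python
# Minimal sketch of real-time result pushes, assuming a simple
# publish/subscribe model; all names here are hypothetical.

class ResultsFeed:
    """Fan out schedule and result updates to registered device handlers."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, handler):
        # e.g. a handset client registering for updates
        self._subscribers.append(handler)

    def publish(self, update):
        # Push each update to every subscribed device handler.
        for handler in self._subscribers:
            handler(update)


received = []
feed = ResultsFeed()
feed.subscribe(received.append)  # stand-in for a Digital Concierge client
feed.publish({"event": "Swimming 100m", "status": "final", "rank": 1})
```

In a real deployment the "handlers" would be network endpoints on the phones rather than in-process callbacks, but the fan-out shape is the same.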
The Games will also introduce new events in some sports, as well as a new competition format, he added. In soccer, for instance, teams can comprise players of different nationalities, which creates additional requirements for the IT infrastructure, he explained.
According to Noblot, Atos is currently working with Omega, SingTel and Samsung to prepare the IT deployments needed for the YOG.
The PC vendor is expected to be Acer, which is the appointed sponsor for both the Vancouver 2010 and London 2012 games, and the company's logo is already visible on the official YOG Web site.
Noblot acknowledged that Atos Origin has been using computing equipment from a particular vendor, but the sponsor has not yet been identified by the SYOGOC.

M'sia govt touts 95 percent OSS adoption

Some 95 percent of Malaysia's government agencies have adopted open source software (OSS), but the remaining 5 percent have not warmed to the concept--and are unlikely to anytime soon, according to a government official.
During her presentation at the GovTech 2010 conference here Thursday, Tan King Ing, deputy director of ICT policy and planning at the Malaysian Administrative Modernization and Management Planning Unit (Mampu), said some 400 government agencies in the country have adopted OSS. The Mampu was set up in 2002, as part of the government's ICT masterplan, to explore the use of OSS in the public sector.
While open source adoption efforts began in earnest in 2004 with 50 agencies, implementation figures began ramping up sharply in 2008 when the Mampu introduced migration and documentation support to move government workers from proprietary office software to OpenOffice.org, said Tan.
In 2008, the Mampu said 281 agencies had adopted OSS. By mid-2009, adoption had risen to 71.1 percent of agencies.
Elaborating on the 5 percent that have not adopted OSS, Tan said these are "very small and far flung [user groups], without much IT resources or personnel". She noted a lack of enthusiasm on their part regarding OSS, given the expertise needed to perform the migration.
She added that during a Mampu survey of government agencies, this 5 percent of users also indicated that they did not plan to adopt OSS in the future. And the Mampu is not expecting them to, either, Tan said.
Path to open source self-sufficiency
Describing the Malaysian government's path to OSS, she said most government agencies' infrastructures were standalone proprietary legacy systems that did not interoperate.
The start of the initiative was also marked by "so much skepticism" toward OSS, prompting Mampu to drive five pilot projects to spread user acceptance, she said.
She added that, in hindsight, governments looking to follow in Malaysia's footsteps would be better served by adopting a broad implementation roadmap, including preparing proprietary business partners for such a significant change.
With the large majority of government bodies currently on OSS, the Mampu's next goal is to help these agencies achieve self-sufficiency so that they would be able to support their own OSS implementations, and write their own in-house applications, said Tan.
"We want Malaysia to become a technology producer, rather than exclusively technology consumers," she said.

Secure data between multiple mobile OSes

As people increasingly use their personal mobile devices such as laptops and smartphones for work, IT administrators have to look into protecting their enterprise networks from the perspective of securing the data, rather than the system.
According to Lawrence Goh, technology consulting lead for Accenture Asean, the process of managing risks from multiple devices, and with it the different operating systems (OSes), is essentially the "same as any enterprise risk management approach".
"The differences come in the implementation, where controls are needed to address the increased number of access points into the corporate world and the move toward the borderless corporate," said Goh in an e-mail interview with ZDNet Asia.
"The result is an approach that encompasses infrastructure, application architecture, identity and access management, endpoint encryption and data leakage prevention, to name just a few, and supported by the skills to assess ongoing risks and adequately respond to incidents," he added.
This view is supported by Ronnie Ng, manager of systems engineering for Symantec Singapore, who added that enterprises need to be able to manage data across multiple OSes, and yet remain platform-agnostic with their security and storage infrastructures.
"The most effective IT infrastructures are those that bring together security, storage and systems management to automate... [This] will enable enterprises to manage the risks and complexity driven by the increased proliferation of devices and operating systems, without increasing time and costs," Ng said.
It is not just network administrators who have to worry about securing enterprise data, though. Mobile OS providers have a part to play, too. For one, Microsoft, which develops the Windows Mobile platform, highlighted that phones using its software have key security elements that can be controlled to ensure the organization's data is protected but [still] accessible.
A Microsoft spokesperson told ZDNet Asia that the Windows Mobile OS is "built around a three-tier security model that prevents malicious software from getting access to device functionality and data". He added in an e-mail interview that phones running this OS meet industry standards such as the Common Criteria EAL 4+ security certification, which is required by over 25 governments worldwide.
However, despite all the measures deployed to safeguard the information flow between devices and networks, Accenture's Goh said the "main threat will always be the end-user".
He noted that all major incidents related to mobile devices over the last three years can be traced back to "process failure and lack of user diligence and understanding".
Mobile threats becoming sophisticated
There are also other external mobile threats to consider. For Ng, the number of attacks designed to exploit a certain OS or platform is directly related to the platform's market share, as malware authors are "out to make money and always want the biggest bang for their buck".
Citing Apple's products as an example, he said the OSX.Iservice Trojan targeting Mac users was a result of the company's rising popularity, and this is a trend that will continue in 2010.
To this end, cybercriminals have used applications such as Snoopware--spy software commonly used by parents, spouses or employers to monitor people--as a way to remotely access smartphones and eavesdrop on confidential conversations, said Ng.
Meanwhile, "Pranking4Profit" is a class of attacks intended to steal money rather than data from compromised terminals, Ng highlighted. "This type of crimeware uses what is known as 'RedBrowser' to infect the phone and send premium short messaging service (SMS) messages from the device to a Web site that withdraws money from a bank or credit account, before the user or network becomes wise."
Failure to guard against such threats, he said, will lead to three areas of risk: compliance, data and privacy, and business and network stability. Ng added that if left unprotected, "mobile devices represent the weakest link in an enterprise's IT infrastructure".

Tech can 'smarten' up cities, planet

Cities are "microcosms" or miniature examples of the major challenges and opportunities facing the planet today, and can lead the way to a smarter planet by leveraging information and communications technology, said an IBM executive.
At a media roundtable here Wednesday, Mark Cleverley, director of strategy for IBM's global government industry, touted sustainability in the company's "smart planet" initiative and reiterated its push to make cities "smarter".
Smarter Planet, Cleverley explained, is IBM's observation of how technology and society are changing to open up new solutions to problems the world faces today.
As the world continues to urbanize, problems such as transportation, food and water supply, are most stressed in cities, he said. "If we're going to build a smarter planet, we'll have to start from our cities, to solve problems at the most intense pressure points," he added.
IBM believes cities need to leverage information and communications technologies to achieve this vision of smarter cities, he said. Cleverley added that by linking the three key attributes of ICT, which IBM identifies as the "3 I's", cities can address today's key challenges. The "3 I's" are:
  • Instrumented: Devices and sensors can be distributed in all sorts of places to measure, sense and see conditions of practically everything.
  • Interconnected: The information gathered by sensors can be communicated among people, systems and objects in more effective ways.
  • Intelligent: People can use analytics to draw insights from the larger quantity and wider variety of information now being gathered.
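As a toy illustration of how the "3 I's" might chain together--sensor readings pooled into one view, then analyzed for an insight--consider the sketch below. All function names, field names and thresholds are invented for illustration, not IBM's.

```python
# Toy illustration of the "3 I's": instrumented (sensor readings),
# interconnected (a shared collection step) and intelligent (simple
# analytics over the pooled data). Every name here is hypothetical.

from statistics import mean

def read_sensors():
    # Instrumented: distributed devices report local conditions.
    return [
        {"sensor": "traffic-01", "vehicles_per_min": 42},
        {"sensor": "traffic-02", "vehicles_per_min": 58},
    ]

def interconnect(readings):
    # Interconnected: pool readings from many systems into one view.
    return [r["vehicles_per_min"] for r in readings]

def analyze(values, congestion_threshold=50):
    # Intelligent: derive an insight (is the road network congested?).
    avg = mean(values)
    return {"average": avg, "congested": avg > congestion_threshold}

insight = analyze(interconnect(read_sensors()))
```

The point is the layering, not the arithmetic: each "I" adds value only because the layer below feeds it.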
Describing cities as "systems of systems", Cleverley listed six "man-made systems": government services, transportation, energy and utilities, healthcare, public safety and education. IBM believes that by solving problems within an individual system, people can use the "3 I's" to gather more insight into how other systems interact.
Cleverley underscored technology's potential to help solve such problems. "The ability of technology to help us with problems is only limited by our imagination, and ultimately, by the ability of people to organize themselves to solve these problems," he said.
However, he noted that the vision can only be successful with more cities and people participating, or what he described as "coalitions of scale". "We're not pretending to be the best urban planners in the world," he said. "We need to work with people and we need to get the debate going in order to make certain things work."
"I think it's more about getting these coalitions together to enlighten them of what is possible," he said, adding that IBM hopes to bring the vision of what is possible to the leaders of cities. However, Cleverley added, it is up to the leaders and communities to decide which of these problems, and solutions, are applicable to their individual cities.

'Coopetition' the new norm in tech

With IT vendors looking for new revenue sources and expanding their business portfolio, "coopetition"--a mix of cooperation and competition--has now become the new norm in the tech landscape, notes an analyst.
Stuart Williams, senior analyst for enterprise software at Technology Business Research (TBR), said in an e-mail interview with ZDNet Asia that the largest and most influential IT vendors such as IBM, Hewlett-Packard, Microsoft, Cisco Systems, Google and Oracle, are "protecting their core markets by entering adjacent" ones. This inevitably creates competition with other large vendors they had previously partnered with, Williams added.
Over the last two years, tech vendors have stepped up their market expansion activities.
In September 2008, Oracle announced its first hardware product, the Exadata server, which was jointly built with partner HP. Last year, Cisco made its foray into the data center business with its Unified Computing System, spanning both blade and rack servers. More recently, Google unveiled its own Android-based phone, after previously working with handset vendors such as HTC to launch Android phones.
Williams said: "'Coopetition' is the new normal for the large systems vendors. The way these large vendors create coopetition varies: Oracle is growing through acquisition, Cisco through internal development and alliances, Microsoft through internal expansion."
Given that these vendors now boast multiple product lines spanning hardware, software and services, they will have encounters that are at times complementary and at others competitive. Such situations call for a balancing act, he said.
"The skill with which these vendors reduce the impact of the areas of competition on the benefits that they gain through cooperation, is a new core requirement for these large systems vendors," Williams explained.
In an e-mail statement, Google said it does not intend to compete with its partners--specifically, those in its Android ecosystem. "Our expectation is that the Nexus One will push the entire mobile ecosystem forward, driving greater innovation and consumer choice," said a Singapore-based spokesperson.
"We look forward to working with other hardware manufacturers to bring more Google-branded devices to market, and opening up these devices to operators around the world," he said. "Android remains an open source mobile platform, and we look forward to the innovation that will come from not only Google, but all Android partners."
In some cases, companies may choose to end an affiliation following shifts in corporate strategy.
Months after news broke that it had acquired Sun Microsystems, Oracle in September last year said it would leverage Sun's technology for its second generation of Exadata. The company reportedly ended its relationship with HP on the data storage server.
Allocation of company resources
Entry into unfamiliar markets could also put a strain on business models and corporate resources.
In the case of Google, its existing customer service model--built around user forums, Frequently Asked Questions and e-mail queries--may no longer be adequate for its new direct sales approach.
Some vendors have tweaked their product focus to be in line with shifting business strategies.
IBM, whose history can be traced back to tabulating machines and dial recorders that tracked the number of hours worked, was previously a hardware-focused manufacturer whose businesses included selling PCs and servers.
In the 2000s, however, it began downsizing its hardware business. In 2002, the company sold off its hard drive unit to Hitachi and in December 2004, did likewise for its PC business, which went to Chinese hardware maker Lenovo. These maneuvers moved the technology stalwart another step nearer to becoming a vendor focused on software and services.
Claudia Tan, general manager of general business at IBM Singapore, told ZDNet Asia in an e-mail that the company had "remixed our businesses in order to move to the emerging higher-value spaces".
As part of this move, Tan noted that IBM exited commoditizing businesses but at the same time, acquired 108 companies since 2000 for about US$22 billion. "These portfolio actions have contributed to a significant change in our mix of business," she said.
According to Tan, Big Blue had invested in capabilities such as business analytics and cloud computing, to accelerate the development of new market opportunities, and also rechanneled spending to areas of "greatest opportunity". In 2009, the company spent nearly US$6 billion on research and development work, she said.
"The unique portfolio of businesses we have built, heavily weighted toward software and services, generates high profitability," she noted. "We will continue to remix to [achieve] higher value through organic investments and acquisitions.
"We are also leveraging our scale and global footprint to improve processes and productivity in a number of areas, like support functions and service delivery."
Impact on industry
According to TBR's Williams, while large vendors can "enter and exit lines of business fairly easily", customers and smaller partners need to be aware of the impact of a changing landscape.
"Partners may be squeezed out as the large vendors enter lines of business previously served by the smaller partners," he noted. "Customers who use a multi-vendor strategy may need to reevaluate when their landscape gets rolled up into a 'single throat to choke'."
As it stands, tech companies are growing their business by aggressively driving expansion into new geographical markets such as the BRIC (Brazil, Russia, India and China) countries, business segments such as the midmarket and small businesses; and industry segments through specialization, said Williams. Consolidation is another avenue of growth.
"The recipe for success for these large vendors is a mix of a broad portfolio of offerings, a clear and compelling value proposition for why customers should adopt 'one-stop-shopping' and in building closer relationships to each customer," he explained.
Williams added: "The trend of industry consolidation will continue for the foreseeable future, especially in the emerging SaaS (software-as-a-service) and cloud space, where we believe 2010 will be a year of acquisition for the large vendors who are playing catch-up."

Microsoft eyes clean break with WinMo 7

Microsoft's long and winding road toward regaining lost ground in the cell phone business will reach an important milestone in Barcelona next month.
At the annual Mobile World Congress event, Microsoft will at long last show off Windows Mobile 7, due to arrive in 2010--its oft-delayed major revamp of the decade-old Windows CE code base that has been at the core of its mobile operating system since the days of challenging the Palm Pilot.
Sources told ZDNet Asia's sister site CNET that Microsoft is still planning to finalize the code for Windows Mobile 7 by summer in order to have the new software on devices that ship before the end of the year.
Separately, though, Microsoft is also working on a new consumer phone line, early pictures of which cropped up last year, that is designed to be the next generation of the Sidekick product line that Microsoft inherited with its acquisition of Danger.
Although it is not a widely rumored "Zune Phone", the new consumer device is based on Windows Mobile and likely to be able to connect to Zune and other consumer services that Microsoft has been developing for some time now, sources said. That product, also due to arrive this year, should come earlier in the year ahead of Windows Mobile 7 devices.
Microsoft declined to comment on Windows Mobile 7 or the new consumer device, but Robbie Bach, the head of the company's entertainment division, did tell CNET in an interview at January's Consumer Electronics Show that Microsoft would have a lot more to say about the future of the phone business in Barcelona. Microsoft has also promised developers headed to the Mix 10 trade show in March that they will be able to get information on how to program for Windows Mobile 7.
"Yes, at MIX10 you'll learn about developing applications and games for the next generation of Windows Phone," Microsoft said on the Mix Web site in a Jan. 20 update. "Yes, we'll have Phone sessions, and we can't say more...yet."
Although Microsoft has typically been loath to make major changes to the desktop version of Windows at the expense of compatibility, the software maker appears ready to make a bigger break with its mobile past--a sensible move given its declining share of both the market and developer interest.
With Windows Mobile 7 hit by several delays, Microsoft last year released Windows Mobile 6.5, an interim update designed to make the current operating system more "finger-friendly" on touch-based devices. The company also rebranded devices using its operating system as "Windows Phones" and launched a new marketing campaign.
At the same time, though, longtime Windows Mobile phone makers including Motorola and HTC have been gravitating toward Google's Android mobile phone operating system. LG, which had planned to center its smartphone efforts on Windows Mobile, has also said it will offer a number of Android-based devices.

Yahoo ends Bartz's first year on up note

Yahoo continued to ease its way back to financial respectability in its fourth quarter, beating estimates from both itself and Wall Street despite a decline in revenue.
In a press release Tuesday, Yahoo said it took in US$1.73 billion in revenue during its fourth quarter, down 4 percent from the same quarter last year but up 10 percent from the third quarter. That exceeds the high point (US$1.7 billion) of the guidance range Yahoo provided after the third quarter, and revenue of US$1.26 billion excluding traffic acquisition costs beat analyst estimates of US$1.23 billion.
"Things seem to be returning to a more normal state in the online ad business," said CEO Carol Bartz during a conference call to discuss the company's results Tuesday. Although the totals are still off compared to last year, both search and display ad revenue on Yahoo sites increased from the third quarter to the fourth, as Yahoo's clients became more confident about their marketing budgets and more willing to spend on Yahoo's inventory, she said.
Fourth-quarter net income was US$153 million despite the drop in revenue, compared to a loss of US$303 million a year ago. Last year's net income number, however, was hurt by a one-time write-off of goodwill. Excluding charges, non-GAAP earnings per share actually fell, from 21 cents last year to 15 cents this year, although Wall Street was expecting just 11 cents in earnings per share this time around.
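A quick back-of-envelope check ties the quarter figures above together; the implied prior-period revenues below are estimates derived from the rounded percentages, not reported numbers.

```python
# Back-of-envelope check of Yahoo's cited quarter-over-quarter figures.
# The implied prior-period revenues are estimates from rounded numbers.

q4_revenue = 1.73  # US$ billion, Q4 2009, as reported

# "down 4 percent from the same quarter last year"
implied_q4_2008 = q4_revenue / (1 - 0.04)

# "up 10 percent from the third quarter"
implied_q3_2009 = q4_revenue / (1 + 0.10)

print(round(implied_q4_2008, 2))  # ~1.80
print(round(implied_q3_2009, 2))  # ~1.57
```

Both implied figures are consistent with the guidance high point of US$1.7 billion that the fourth quarter is said to have exceeded.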
For the full 2009 fiscal year, Yahoo's first under Bartz, the company recorded US$6.5 billion in revenue and net income of US$598 million. Revenue was down 10 percent compared to 2008, but net income was up 43 percent.
Yahoo attempted to shed both workers and businesses that it didn't like during 2009 in order to save money, but "2010 is not about divestitures for Yahoo," Bartz said. Instead, the company is planning "acquisitions and investments to make Yahoo even stronger", she said, although she cautioned that those acquisitions will be relatively small.
Yahoo said it expects to record between US$1.575 billion and US$1.675 billion in revenue during its second quarter, with income from operations at US$90 million to US$110 million. That does not include any effects that will come into play if the Microsoft search deal is finalized, Yahoo said.
The company did not provide an update on when that deal might be finalized during the conference call. Yahoo and Microsoft are talking to the Department of Justice about the potential anticompetitive impact of the deal, which tends to happen when two large players in a three-player market propose a merger.
Bartz likewise dodged a question regarding Yahoo's plans for its business activities in China. The company is an investor in Alibaba, which runs a Chinese-language content site bearing Yahoo's brand in China. Google, of course, recently threatened to pull out of China unless it can offer an uncensored search engine, and Secretary of State Hillary Clinton last week warned U.S. Internet companies that they have a "shared responsibility" to stand up against Internet censorship.
"We have a good relationship with them," Bartz said, even though Alibaba called Yahoo "reckless" for supporting Google's decision to speak out against cyberattacks believed to be the work of the Chinese government. Yahoo co-founder Jerry Yang sits on Alibaba's board of directors, she said.
As for other international topics, Bartz announced she's no longer looking for a management hire to head up Yahoo's international efforts, as had been the plan to date. Bartz said she couldn't find a candidate she liked, and so the new plan is to merge the Emerging Markets group into the other three geographical divisions: Asia-Pacific, Americas, and Europe, the Middle East, and Africa (EMEA). The current leaders of those groups will continue on, she said.

More regulation to clarify rising tech's biz models

Companies should expect to see more regulations introduced and increased government intervention take place as emerging technology gets deployed in the enterprise sphere, according to an analyst.
Steve Prentice, a Gartner fellow and vice president, highlighted four broad trends during the research firm's Gartner Predicts 2010 event on Tuesday. He believes that the following--social computing, contextual computing, advanced analytics and cloud computing--will herald long-term changes in approach for IT professionals.
As these emerging technologies get adopted by companies, "regulation will start to come in", Prentice noted, citing the example of Internet advertising in the U.S.
Speaking to ZDNet Asia on the sidelines, Prentice said the increased regulatory scrutiny in the financial sector will "spin off" to other IT-related sectors. He singled out the environmental sector as an area that will "definitely be regulated", though there is still much "uncertainty" over how to do so.
"Uncertainty is not helpful for business planning. So in the near- to mid-term as the market begins to grow, we should see more regulation take place as this will bring clarification to business models in these emerging technologies, which is a good thing," he added.
A world of information
Contextual computing uses location, presence, social attributes, and other environmental information to anticipate an end-user's immediate needs, offering more sophisticated, situation-aware and usable functions, said Prentice.
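A minimal sketch of the idea, with entirely invented rules and field names, might look like the following: the "context" is a bundle of location, presence and time signals, and the system maps it to an anticipated need.

```python
# Hedged sketch of contextual computing: combine location, presence and
# time-of-day signals to anticipate a user's need. The rules and field
# names below are illustrative, not any vendor's actual product logic.

def suggest_action(context):
    location = context.get("location")
    presence = context.get("presence")
    hour = context.get("hour", 12)

    # Presence trumps location: don't interrupt a meeting.
    if presence == "in_meeting":
        return "silence notifications"
    # Location-aware shortcut for travelers.
    if location == "airport":
        return "show boarding pass"
    # Time plus location anticipates the evening commute.
    if hour >= 18 and location == "office":
        return "suggest commute route home"
    return "no suggestion"

print(suggest_action({"location": "airport", "presence": "available"}))
```

Real contextual systems would of course learn such mappings rather than hard-code them, but the input signals are the ones Prentice lists: location, presence and environmental attributes.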
He forecast that more than 7.3 billion networked devices and over 1.2 billion smartphones will be in use in 2012, which will give the technology a boost--potential that companies should look to unlock and leverage.
The other two big trends--analytics and cloud computing--have been much talked about in the enterprise space for some time now, and Prentice expects these technologies to develop further and come to the fore in 2010.
"The market will expand from the proprietary mega providers of today, to ecosystems and supply chains of providers, to thousands of smaller providers that rely on agility and standards for interoperability to compete," said Prentice of cloud computing. The maturing cloud had earlier been identified by fellow research company, IDC, to be one of the top 10 tech trends this year.
As for social computing, the Gartner analyst called on organizations to stop blocking employees from Facebook and other social-networking sites, as they will "access these using parallel networks anyway". In his presentation, Prentice predicted that Facebook will become the "hub for social network integration and Web socialization by 2012". He also said it is better for companies to embrace the "consumerization of IT" and respond to the changing business landscape rather than to distance themselves from it.
This view was echoed by fellow presenter Andy Rowsell-Jones, vice president and research director of Gartner, who suggested that CIOs can leverage Web 2.0 and social computing to "enable mid-office activities".
To him, mid-office processes refer to where "coordination, collaboration, management and decision development [in the organization] occur". Offices with strong mid-office capabilities are able to make decisions faster, mobilize resources better and execute plans with greater efficiency, he added.
Rowsell-Jones noted that the power of utilizing Web 2.0 and social computing lies in the ability to increase collaboration innovatively.
"[These lighter-weight technologies] not only engage people, they also make it possible to combine personal/group knowledge with the information in corporate systems," he said in his presentation. "Without such a connection, social-networking solutions lack the scale and leverage to transform enterprise operations."
Modest IT budget increase
According to the CIO Survey 2010 by Gartner's Executive Programs (EXP), which Rowsell-Jones mentioned in his presentation, CIOs are expecting their IT budgets to remain largely flat, with the weighted global average growing by a modest 1.3 percent, up from negative 8.1 percent a year earlier.
Commenting on this, Rowsell-Jones said he welcomes positive over negative growth any day. However, he observed that CIOs will be under pressure to deliver on more projects--many of them previously deferred by the global recession--as expectations of an economic recovery increase, but based on almost the same level of IT resources as 2009.

AMD readies Fusion for 'new form factors'

AMD is set to release a new platform next year targeting "new form factors" in the ultrathin notebook category, an executive based in the island-state said Tuesday.
Codenamed Brazos, the platform will include Ontario, an APU (accelerated processing unit) under the chipmaker's Fusion strategy. Born out of the acquisition of graphics maker ATI Technologies, Fusion is AMD's branding for its plans to integrate a GPU and CPU on the same chip, and was initially targeted to launch in 2009.
Brazos follows Nile, AMD's 2010 ultrathin notebook platform, Tan See Ghee, marketing director for South Asia at AMD, said in a media briefing Tuesday on the company's roadmap. Brazos also has a "lower power envelope" compared with Sabine, another ultrathin notebook platform due in 2011. Sabine will contain the Llano APU, which was taped out in 2009. All APUs are manufactured on the 32-nanometer process technology.
The "new" form factors Brazos is angled toward refer to those new to the industry, not just AMD, he emphasized.
In 2008, Tan had noted that Fusion had characteristics that suit requirements of lightweight, small form factor notebooks.
When asked if AMD had plans for the tablet market that vendors such as Apple were reportedly moving toward this year, he pointed out that while AMD is "currently not painting a strategy" around tablets and slates, Brazos would be suited for form factors such as these.
AMD has already been working with Hewlett-Packard and some regional OEMs (original equipment manufacturers) on tablets, Tan said, but the HP models are all based on the Puma platform, introduced in 2008. AMD will continue to monitor the "exciting" tablet market to determine the right time to introduce new offerings, he added.
Desktop, notebook and server offerings expected
Other than Nile, AMD will introduce three other platforms for personal computers in the first half of this year. Leo, for "enthusiast" desktops, will contain CPUs with up to six cores, while Dorado is the company's mainstream desktop platform. In the mainstream notebook category, Danube will feature triple-core and quad-core processors. All platforms released in 2010 will be based on the 45-nm process.
On the server side, the chipmaker will roll out two platforms for different types of customer usage. Maranello, the higher-end AMD Opteron 6000 series geared toward high-performance computing, will be paired with the eight- and 12-core Magny-Cours processors and come in two- and four-socket options.
San Marino and Adelaide, which represent the AMD Opteron 4000 series for cloud computing and other functions requiring energy efficiency, include the four- and six-core Lisbon chip and come in one- and two-socket options. All server platforms will be released in the first quarter of this year, according to Tan.
Also in 2010, AMD expects to allow customers to sample Bulldozer and Bobcat processors as well as Fusion products.
Tomo Kamiya, AMD's regional vice president for South Asia, noted that the company "seriously believes" the time has come to invest in the region. Citing AMD's recent Q4 and full-year 2009 financial performance, he said AMD was in a favorable position heading into 2010.
"From a [fabless] business model perspective and a financial perspective, we're in a very good position going forward," said Kamiya.
READ MORE - AMD readies Fusion for 'new form factors'

Juniper, Polycom forge alliance

Juniper Networks and Polycom have teamed on a joint telepresence and videoconferencing offering that's designed to serve as a counterweight to Cisco Systems and its recent acquisition of Tandberg.
In a statement Monday, Juniper and Polycom said they will optimize their platforms so service providers can offer video and telepresence cheaply. The argument: it's cheaper for enterprises to deploy telepresence as a service from their network providers instead of building out their own networks.
The deal with Juniper highlights Polycom's partnership strategy. Polycom last week announced a global reseller agreement with Siemens, and it highlighted a telepresence demo with IBM at CES 2010. The aim for Polycom: forge partnerships that allow it to surround Cisco and the newly acquired Tandberg.
Read more of "Juniper, Polycom forge telepresence, video-conferencing alliance" at ZDNet.
READ MORE - Juniper, Polycom forge alliance

StopBadware goes nonprofit with funding

StopBadware, the antimalware effort run out of Harvard's Berkman Center for Internet & Society, is spinning off to become a separate nonprofit with funding from Google, PayPal, and Mozilla, the organization was set to announce on Monday.

StopBadware was launched four years ago to help companies keep spyware, viruses, adware, and other malware off their sites. The project collects and analyzes data from Web sites and advocates for safer practices.
The group's "badware alerts", expose applications that violate its badware guidelines and have AOL, Real Networks, Sears, and others to change their practices regarding customer choice. StopBadware also collaborates with Google in warning users about Web sites that can install malware on visitors' computers.
"StopBadware has grown in just a few years from the seed of an idea into an internationally recognized force in the fight against harmful software," Urs Gasser, executive director of the Berkman Center, said in a statement. "We are proud that, by developing a unique mission and becoming independent, StopBadware now follows in the footsteps of previous ventures like Creative Commons and Global Voices that have their roots here at the Berkman Center."
Serving on the board of directors of StopBadware will be Harvard Law professor John Palfrey; PayPal Chief Information Security Officer Michael Barrett; Google Chief Internet Evangelist Vint Cerf; Esther Dyson, an angel investor for startups; Mozilla Chief Evangelist Mike Shaver; Ari Schwartz, chief operating officer at the Center for Democracy & Technology; and StopBadware Executive Director Maxim Weinstein.
StopBadware did not disclose how much funding it was receiving from the investors.
READ MORE - StopBadware goes nonprofit with funding

White House puts companies on notice in China

U.S. Internet companies might soon need to find a new strategy for dealing with China.
In announcing that it is now U.S. policy to advocate a free and open Internet around the world, Secretary of State Hillary Rodham Clinton on Thursday essentially dared U.S. companies to follow Google's lead and put an end to their complicit censorship of Internet content. Google has said it will shut down its Chinese search engine if it can't find a way to offer an uncensored version under Chinese law, and while no one else has jumped on that bandwagon, they may soon have little choice.
"...We are urging U.S. media companies to take a proactive role in challenging foreign governments' demands for censorship and surveillance. The private sector has a shared responsibility to help safeguard free expression. And when their business dealings threaten to undermine this freedom, they need to consider what's right, not simply what's a quick profit," Clinton said in remarks Thursday at the Newseum, before an audience including members of Congress, representatives from nonprofit groups, and perhaps more than one Internet company executive forced to ponder the meaning of that paragraph.
Clinton stopped short of actually proposing regulations or sanctions on Internet companies that comply with censorship laws. But her tone was clear: it's now the policy of the U.S. government to renounce corporate "engagement," or the belief that by merely being in countries like China, U.S. Internet companies are helping expand access to information.
Will it work? Google, Microsoft, and Yahoo have already formed the Global Network Initiative, a consortium of companies and organizations designed to provide guidelines for operating in countries with authoritarian governments without turning into tools of those governments. Clinton acknowledged the work of the GNI during her speech, but is calling on companies to do more.
Microsoft declined to directly address its plans for China in a statement, but thanked Clinton for recognizing the GNI. "We welcome Secretary Clinton's remarks and applaud the heightened attention she has brought to these issues of privacy and freedom of expression. We agree with Secretary Clinton that both governments and the private sector have important roles to play," the company said. Last week, Microsoft CEO Steve Ballmer said that the company remained committed to China despite Google's announcement.
Google, which was recognized during Clinton's speech for "making the issue of Internet and information freedom a greater consideration in (its) business decisions," said it welcomed the challenge. "Free expression and security are important issues for governments everywhere, and at Google we are obviously great believers in the value to society of unfettered access to information. We're excited about continuing our work with governments, human rights organizations, and bloggers, to promote free expression and increased access to information in the years ahead," it said in a statement.
Yahoo did not respond to a request for comment.
Rebecca MacKinnon, a fellow at the Open Society Institute and member of the Global Network Initiative, compared Clinton's push to similar standards companies have been forced to adopt over the ages when operating outside of U.S. laws, such as avoiding the use of child labor in other countries.
Voluntarily adhering to those standards, however, will require corporations to do something corporations tend to dislike: decrease revenue, increase costs, and reduce profits.
"Companies are beginning to look at the long-term interest in an open Internet and understanding there are some short-term costs to functioning that way," said Sally Wentworth, regional director for North American at the Internet Society, a nonprofit that focuses on global Internet standards and education.
China, already with the most Internet users in the world despite having only 25 percent of its population online, is a huge source of future growth for Internet companies. U.S. companies have invested billions in China, not only in well-known areas like manufacturing and software development, but in hopes of courting that enormous future audience that will eventually be searching, watching videos, and consuming news--of some type--on the Internet.
But with Clinton's remarks, U.S. companies are in an even more difficult place than they were when Google made its announcement last week. Will they have a harder time getting government contracts if they do business with the Chinese government? Will there be additional taxes, or even eventually fines for following censorship laws in other countries?
After all, U.S. companies can't go breaking laws they don't like around the world, but they can refuse to subject themselves to those laws if they can be convinced that it will eventually be worth their while.
"Their business depends upon trust," MacKinnon said, referencing Google, Yahoo, Microsoft, and other Internet companies. Yahoo is still smarting in China from its decision to hand over information regarding a Chinese dissident to the government in 2005, and U.S. Internet companies could make a brand for themselves in China if they stand up to the government: the Great Firewall can't get everything.
Still, there's a sense that some tech companies would rather the U.S. force some sort of move from China on Internet censorship, instead of having to decide for themselves which countries are open for business and which aren't.
"We look to the U.S. government to address laws and practices in other countries that either facilitate censorship, oppression or a fractured Internet, or are unhelpful to International cooperation in cybersecurity and law enforcement," said TechAmerica, a tech industry trade group, in a statement. "Except in cases involving outright sanctions asserted by the U.S. government, American values also require the freedom of enterprise: Each company must decide where to do business on behalf of its customers, employees, and investors."
This policy will take some time to evolve, as Clinton and other speakers Thursday noted that this effort is merely the beginning of a long road toward promoting an open Internet around the world.
Still, she knows that while the moral pitch is easy, the business end will be harder.
"American companies need to make a principled stand," Clinton said. "This needs to be part of our national brand. I'm confident that consumers worldwide will reward companies that follow those principles."
READ MORE - White House puts companies on notice in China

Facebook's 'Dashboard' will clean up apps soon

A post Thursday on the Facebook developer blog announced that in a few weeks, the social network will be launching its "Dashboard" for third-party applications built on its platform. Right now, developers are allowed access to a "sandbox" where they can experiment with the new format and see what will be different.
What will the dashboard, part of a newly organized Facebook home page, bring to ordinary users? For the most part, it cleans up the Facebook app experience for users who may have installed dozens of third-party applications, and separates games--many of the platform's biggest sensations--into their own tab.
Applications that a user has used recently are grouped at the top, displaying news updates like "It's your move in your Scrabble game against Bill Gates". There's also a grid to show which applications that Facebook member's friends have been interacting with recently, along with suggestions for others that they might like based on what they already use. A set of "counters" in the left-hand sidebar shows members that they have updates and alerts from individual applications.
Developers haven't always been thrilled by Facebook's occasional crackdowns on what apps can and can't do, from e-mail notifications to prominence on members' profiles. But redesigning and cleaning up the app experience so that it's easier for Facebook users to find new apps and get more updates from the ones they already use is likely to be a well-received move.
READ MORE - Facebook's 'Dashboard' will clean up apps soon

Street View patent doesn't bar competitors

Google's bid to patent ads within its Street View application will not bar other maps providers from selling ads within their online maps, says a lawyer.
The search giant earlier this month was granted a patent protecting the process of displaying ads on its street-level map imagery, called Street View. The patent details Google's technology for identifying spaces, such as posters and billboards, within the Street View imagery and selling them to advertisers.
Street View currently uses a technology that automatically scans and identifies sensitive information such as faces and car license plates, and blurs these images out.
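The article does not describe Google's actual pipeline; in general, such systems first detect bounding boxes around faces and plates, then obscure those regions. A minimal, stdlib-only sketch of the obscuring step, assuming a grayscale image stored as a list of rows and detection boxes already supplied (the function name and box format are illustrative, not Google's):

```python
def blur_regions(image, boxes, block=8):
    """Return a copy of `image` (a list of rows of pixel values) with
    each (x, y, w, h) box pixelated: the box is split into block x block
    tiles and every pixel in a tile is replaced by the tile's mean,
    destroying fine detail such as facial features or plate characters."""
    out = [row[:] for row in image]  # copy so the input stays untouched
    for x, y, w, h in boxes:
        for ty in range(y, y + h, block):
            for tx in range(x, x + w, block):
                # Clamp the tile to the box edges.
                tile = [(r, c)
                        for r in range(ty, min(ty + block, y + h))
                        for c in range(tx, min(tx + block, x + w))]
                mean = sum(out[r][c] for r, c in tile) // len(tile)
                for r, c in tile:
                    out[r][c] = mean
    return out
```

Averaging each tile discards the high-frequency detail needed to recognize a face or read a license plate, while leaving the rest of the scene intact.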
While Google said it has no plans to put ads in Street View, there are concerns online about whether this prevents other providers, such as Microsoft or MapQuest, from selling ads on their maps.
Han Wah Teng, associate director at Singapore-based law practice Nanyang Law, said in a phone interview that Google's patent covers the process used, detailing how a space is automatically identified and how ads are placed over these spaces.
That means a competitor intending to display ads through another method--for example, by manually identifying spaces, instead of automatically--could still sell ads on their maps, said Han. Since a manual method of selling ads uses existing technology that is not covered by the patent, maps providers that use such methods would not be breaching the patent, he explained.
Google's patent, filed in 2008, also describes an advertising auction process, which links the image of a property to its owner, or puts it up to the highest bidder. This opens the possibility for a business competitor of the company photographed to take out an ad on its building. For instance, a Street View billboard located next to a restaurant may end up displaying an ad for a competing establishment, if the latter wins the space.
Han said this may open an ethical debate, but in the eyes of the law it is not yet prohibited. Since the Street View images are copyrighted by Google, the company can manipulate them in any way it deems fit, he said.
"There is the broader question of whether Google should be allowed to take photos of buildings and manipulate them in that way, but that [is an ethical] debate [that] falls outside IP (intellectual property) law.
"There are no laws regulating such considerations at this point," he noted.
READ MORE - Street View patent doesn't bar competitors

China denies role in Google cyberattacks

After warning of strained U.S.-China relations, China's government has issued a statement denying any state involvement in the cyberattacks on Google and some 30 other companies.
The statement, issued Monday Beijing time by China's Ministry of Industry and Information Technology and carried on the state news agency Xinhua, comes at a time of heightened tension between China and the United States over Internet censorship and security in China.
The "accusation that the Chinese government participated in (any) cyberattack, either in an explicit or inexplicit way, is groundless and aims to denigrate China," an unidentified ministry spokesman told Xinhua, according to an Agence France Presse report. "We are firmly opposed to that."
U.S. Secretary of State Hillary Rodham Clinton formally denounced Internet censorship in a speech Thursday that was directed both at the private and public sectors. For corporations, she said, "Censorship should not be accepted by any company from anywhere. American companies need to make a principled stand."
China, which has stated that companies doing business in that country must respect and adhere to its laws, responded by warning that the new U.S. stance could hurt relations between the two countries.
"The United States has criticized China's policies to administer the Internet and insinuated that China restricts Internet freedom...This runs contrary to the facts and is harmful to China-U.S. relations," a Chinese Foreign Ministry spokesman said.
Google disclosed the attacks targeting it and other U.S. companies on January 12 and said the attacks originated in China. The company said it discovered the attacks in mid-December and, while it did not specifically implicate the Chinese government, said that as a result of the incidents it may withdraw from doing business in China.
Source code was stolen from some of more than 30 Silicon Valley companies targeted in the attack, sources said. Adobe Systems has confirmed that it was targeted by an attack, and sources have said Yahoo, Symantec, Juniper Networks, Northrop Grumman, and Dow Chemical also were targets.
READ MORE - China denies role in Google cyberattacks

Kodak sues Apple and RIM over camera tech

Eastman Kodak is suing Apple and Research In Motion, claiming their camera-enabled smartphones infringe on its patents and asking for U.S. imports of the handsets to be stopped.
The camera maker has filed a complaint with the U.S. International Trade Commission (ITC) against the companies, which alleges that iPhones and camera-enabled BlackBerry devices violate its patent for color image previewing, Kodak said on Thursday.
It has also lodged two separate patent lawsuits against Apple in the U.S. District Court for the Western District of New York, one focusing on color image previewing and processing and the other on digital-camera and computer-process technology.
Read more of "Kodak sues Apple and RIM over camera tech" at ZDNet UK.
READ MORE - Kodak sues Apple and RIM over camera tech

Niche, integrated support to benefit Twitter rivals

Twitter may be the most popular microblogging service, but its competitors are targeting a different audience as well as finding markets in other countries through affiliation.
Jeffrey Mann, Gartner's research vice president for collaboration and social software, told ZDNet Asia in an e-mail interview that although being a first mover in the microblogging world has benefited Twitter and the site has seen adoption worldwide, its market dominance is concentrated in Europe and North America.
In Asia, Plurk has done well because it made support for local languages a priority, said Mann, adding that Me2Day is popular in Korea, while Zuosa is gaining mindshare in China.
He noted that affiliation with other social networks in each country also plays a part in the popularity of microblogging services, although language and character set support are still "big factors". He said developing games that use virtual currency on the microblogging platform is also a way to gain popularity in Asia.
Springboard Research's senior services research analyst, Sanchit Vir Gogia, told ZDNet Asia in an e-mail interview that for microblogging sites, which are also social networking Web sites, to be successful, an important strategy would be their integration with other players in the space.
"It's important to remember that while people are craving to communicate their thoughts and ideas with the world at large, it is both tedious and time consuming for them to maintain a plethora of such accounts and separately update them," Gogia explained.
Affiliation with other social networks worked for Twitter, too.
According to the Springboard analyst, Twitter picked up real pace after linking up with Facebook and LinkedIn. "This [tie-up] has allowed users to air the same view at once across these social networking sites, it also allows them to better manage their posts," he said.
Other factors that can help drive adoption are unique services available on the microblogging sites and the richness of features provided, that is, whether the features integrate with third-party apps or allow the integration of pictures, videos and files for real-time information exchange, Gogia said.
Microblogging for enterprise
Asked if a single microblogging platform will emerge as the sole market leader, Mann said it was unlikely because local differences will remain.
He pointed out that there is already a thriving business in enterprise microblogging services, provided by market players such as Yammer, Present.ly, Socialtext and Blogtronix, which are aimed at internal use within a corporate environment.
Phil Spitzer, spokesperson from Yammer, told ZDNet Asia in an e-mail interview that the company's service is different from other microblogging services because it operates within a private network.
"While services like Twitter are about being as public as possible, Yammer is about being as private and secure as possible," Spitzer said. "Because privacy and security are two cornerstones of Yammer, it's a great solution for enterprises."
He explained that Yammer acts as a central platform for users to connect through communication channels such as e-mail, desktop clients and mobile clients. It provides threaded messages and attachments, and stores the information in the cloud with search capabilities, he said.
According to Springboard Research's Gogia, as enterprise microblogging services target a niche market, they might never reach the critical mass Twitter has seen, and will have to work with bigger players such as Twitter and other social networks to increase user adoption.
Despite the popularity of dedicated microblogging services, Gartner's Mann said the concept does not seem to have caught on in Japan. "A big reason is that the Japanese have been blogging for years from their mobile phones," he said. "These blog posts are often [already] very short, so that they don't see the point of a dedicated microblogging service."
READ MORE - Niche, integrated support to benefit Twitter rivals

Report: New York Times to charge online readers

The New York Times is reportedly getting ready to charge readers for access to the venerable newspaper's online content.
The newspaper is expected to announce in coming weeks that it will institute a metered pay plan in which readers would have access to a limited number of free articles before being invited to subscribe, according to a report in New York magazine that cited sources close to the newsroom.
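The report gives no implementation details; mechanically, a metered paywall only needs a per-reader, per-month view counter. A rough sketch under those assumptions (the `Meter` class and the quota of five are hypothetical, not the Times' design):

```python
from collections import defaultdict

FREE_ARTICLES_PER_MONTH = 5  # hypothetical quota; no figure had been announced


class Meter:
    """Minimal metered-paywall sketch: each reader may view a limited
    number of articles per calendar month before being asked to
    subscribe. Counts effectively reset when the month key changes."""

    def __init__(self, quota=FREE_ARTICLES_PER_MONTH):
        self.quota = quota
        self.views = defaultdict(int)  # (reader_id, month) -> views served

    def allow(self, reader_id, month):
        """Record a view; return True if served free, False if paywalled."""
        key = (reader_id, month)
        if self.views[key] >= self.quota:
            return False
        self.views[key] += 1
        return True
```

A new `(reader, month)` key starts at zero, which is what lets the quota renew each month without any explicit reset step.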
The report also suggests that a content deal could be in the works for Apple's long-rumored tablet, which many expect to be unveiled on January 27. Apple has reportedly been shopping its device to media companies in Australia to gauge interest in having their products available on the device when it's released.
A New York Times representative's comment seemed to indicate that changes were coming to the Web site.
"We'll announce a decision when we believe that we have crafted the best possible business approach," Times spokesperson Diane McNulty said. "No details till then."
As readers have increasingly gone online for their news, papers have suffered declining subscriber numbers and lower advertising revenue, resulting in a dramatic industry contraction. Newspaper publishers and the Associated Press have blamed Google and other news-aggregation sites for their woes, leading to threats they will delist their content and begin charging online readers.
In a bold move, New York newspaper Newsday announced plans in February 2009 to begin charging online readers for access to its content. Newspapers such as the San Francisco Chronicle have tried to push readers back to the physical newspaper by promoting "print-only content" that offers popular features and columnists formerly also available on the newspaper's Web site.
Among the country's largest newspapers, only The Wall Street Journal has managed to continue charging online subscription fees. The New York Times abandoned a two-year experiment with the Web-subscription model in 2007, suggesting that the company's projections for subscriber revenue were small compared with advertising sales.
However, such a plan isn't likely to garner much support from readers. A Harris poll released earlier this month found that 77 percent of respondents said they wouldn't pay anything to read a newspaper's stories on the Web. Of those willing to be charged for access to content, 19 percent would pay between US$1 and US$10 a month.

Enterprise demands consumer-style search

Enterprise search needs to take into consideration more parameters than consumer Web search, yet present the results in a consumer-friendly wrapping for business users, say industry voices.
With business users more accustomed to hopping on the Web and performing searches today, enterprise search has to keep up with consumer search, while including the additional parameters and layers that effective enterprise search demands.
Cheong Weng Seng, IBM's Asean sales executive for collaboration, told ZDNet Asia in an e-mail interview that enterprises are finding it a "challenge" to present to users a "consumer" experience on top of the backend mechanics of enterprise search.
Singapore systems integrator NCS said in a previous interview that the country's workers, thanks to the Web, are more interested in search and are demanding enterprise search tools similar to what they are used to. As a result, users are turning away from "traditional" enterprise taxonomies and embracing natural language search, it said.
Cheong agreed: "Largely influenced by the consumer search experience, enterprise users have a set of clear and demanding expectations about how enterprise search should look, feel and perform."
Unlike the consumer Web, enterprise search must be tuned by the IT department to correctly reflect relevant results, he said.
Tied to this is the practice of knowledge management, which involves linking an organization's intellectual properties--in the form of data--with domain knowledge of its employees, and translating the information to be used in business strategy and practice, Cheong noted.
Walking the line between enterprise and consumer search has been tricky for enterprises.
In 2006, IT analyst firm Gartner noted that Google's enterprise search appliance offerings were immature compared to the competition, and advised enterprises to look toward enterprise-specialist offerings. The Internet search company last year announced that version 6.0 of its search appliance had been amped up to handle "billions" of documents.
Social functions of enterprise search
The world of knowledge management has also evolved to include social aspects, Cheong added. "According to a survey by Harris Interactive, two thirds of those polled required the help of colleagues who have the relevant knowledge or expertise to complete a task, but needed help in finding them.
"If you're faced with a problem, you're more likely to call a helpdesk than search for the answers in a manual," he said.
As a result, wikis, blogs and platforms such as Facebook and Twitter form an additional layer of information that enterprise search now has to incorporate, said Cheong.
James Chia, head of Fuji Xerox Singapore's document solutions group, said enterprise content management tools have also evolved to include Web 2.0 platforms for users to share information with colleagues.
Information from these disparate sources is filed together with other records belonging to the enterprise, he said.
Locating the right data amid a company's crowded database then requires better search capabilities.
Indexing new layers of information in an enterprise differs from doing so on the consumer Web, said Cheong. For example, one way social media gets indexed with the correct relevance in an enterprise is through tagging. This practice enforces peer review to correctly classify data, increasing contextual relevance, he explained.
Chia said enterprises require richer search capabilities tied to job functions: an engineer searching for "experts" more likely wants the results to be related to his field or research area, for instance.
Enterprises also need added parameters, such as finding who else frequently searches for the same information. Such information could go toward linking up with relevant contacts, and hence, saving time, he explained.
Cheong further noted that the practice of taking into account the number of inbound links to a document, to increase its weightage, is only relevant for Web searches. "In enterprise search, the number of hyperlinks is limited so the relevance ranking based on links is not useful.
"Other factors such as content, context, and reuse of the document might be considered more important," he said.
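The alternative ranking signals Cheong describes can be illustrated with a toy scoring function. The field names and weights below are illustrative assumptions for the sketch, not any vendor's actual formula:

```python
def enterprise_relevance(doc, query_terms, searcher_dept):
    """Toy enterprise relevance score: weights content match, context
    (the searcher's department matching the document's) and reuse,
    rather than counting inbound links as Web search does.
    All field names and weights here are illustrative assumptions."""
    text = doc["text"].lower()
    content = sum(t.lower() in text for t in query_terms) / len(query_terms)
    context = 1.0 if doc["department"] == searcher_dept else 0.0
    reuse = min(doc["open_count"] / 100.0, 1.0)  # normalized access count
    return 0.5 * content + 0.2 * context + 0.3 * reuse
```

Under a scheme like this, a frequently reused document from the searcher's own department outranks a rarely opened one from elsewhere, even when both match the same keywords.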
Lan Chi Fei, Canon Singapore's senior marketing manager for business solutions and business imaging, said in an e-mail that enterprise search data needs to be retained over a period of time for regulatory compliance, yet indexed efficiently enough that it is readily available when required.
"Accountability of information is the key differentiator, and document management software that allows enterprises to store data securely and efficiently holds the key to successful enterprise search," said Lan.

5 must-haves in patch management

Despite the emergence of high-profile hacks and a narrowing window for vulnerabilities to be exploited, not all computers have the latest security patches, allowing cybercriminals to wreak havoc on systems that are not kept up-to-date.
The Conficker saga highlighted this tardiness last year after a flood of computers were infected by the worm over several months, even after a patch was made available by Microsoft in October 2008. In April 2009, Sophos' senior technology consultant Graham Cluley blogged that 11 percent of users who had taken the company's endpoint assessment test online, had yet to patch their systems.
Patch management is not the only aspect of enterprise security, but security vendors point to it as an important piece of the corporate IT protection puzzle.
ZDNet Asia spoke with three industry experts to highlight some essential elements that a sensible patch management strategy should include:
1. Vulnerability assessment
Ronnie Ng, Symantec's senior manager of systems engineering for Singapore, said in an e-mail that organizations need to constantly identify vulnerabilities and assess risks, in order to obtain "an accurate snapshot of possible threats to their IT environment".
Policies for patch management and risk mitigation should also be habitually reviewed, Ng added.
2. Patch acquisition and testing
Enterprises, he said, ought to monitor closely the availability of patches or updates. "The available patch's relevance and criticality to the IT environment should then be determined, and its source and integrity verified."
IT security administrators must also ensure patches will not damage or come in conflict with existing configurations or applications, he added. This can be achieved by deploying patches in a test setup that closely mirrors the production environment.
On top of that, the network should also be validated to ensure it can sustain the updates, said Ng, adding that patches should be deployed in a controlled and predictable fashion.
3. Smart patch deployment
He noted that communication is an important aspect during the deployment phase. When implementing patches, businesses need to keep users informed of patch rollout schedules, he explained.
Mark Goudie, Asia-Pacific managing principal for investigative response at Verizon Business, pointed out that enterprises should minimize the number of patch packages installed, and keep things simple in their environment.
"Why use patch packages that you do not need? An attacker will happily use a package that you do not use but have installed on your system, to break into your enterprise," Goudie said in an e-mail. "The smaller the number of installed packages, the less moving parts in the system and therefore, [fewer] things can go wrong."
At the end of the day, organizations should not rush to patch, as breaches resulting from zero-day exploits are rare, he said. "The 2009 Data Breach Investigations Report showed only one vulnerability exploited that had a patch available for six months," he noted. "The other exploited vulnerabilities that led to a data breach had patches available for more than a year."
Patrick Chan, IDC's chief technology advisor for emerging technologies, said in an e-mail interview that enterprises need to have a contingency plan in place for "rollback", should a particular patch happen to "break" the system.
Forward-looking organizations, he observed, are taking proactive steps to refine their patch management practices, including centralizing patch management efforts and roles, testing patches within enclosed virtualized environments before implementation, and leveraging virtualization technologies to patch a single image and deploy to many instances.
Putting in place IT governance relating to the use of automation tools, Chan added, is "imperative" for organizations. This includes logging actions and changes made during the patching process.
"Some organizations are increasingly looking to shift some of these responsibilities to their outsourced vendors and cloud computing platform providers," he said. "However, for on-premise assets, organizations still need a stringent practice [such as] ITIL (IT Infrastructure Library) to introduce lifecycle approaches to patch management."
4. Patch consistency
After the deployment, organizations need to verify that patches have been properly and successfully applied to all systems that require the update.
"Patching must be done completely; companies can't afford to miss a few systems [because] they were hard to patch, or the patch did not apply and no one checked it," Goudie pointed out. "Hackers will check your patch management systems for you."
To that end, development and testing systems ought to be included, he said. "If the system has production data or access to production data, it must be properly patched and managed.
"Speed is not the key to being safe, consistency is," said Goudie.
Symantec's Ng added that enterprises should carry out regular inventory checks on both hardware and software, so that they maintain an up-to-date view of the resources that need to be protected.
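The inventory-driven verification Goudie and Ng describe can be sketched as a simple cross-check of required patches against each host's reported patch level. The host names and patch IDs below are illustrative:

```python
def missing_patches(installed, required):
    """Return, per host, the required patches it still lacks.
    An empty result means the rollout was applied consistently everywhere,
    including the dev/test boxes attackers will happily use."""
    return {host: required - have
            for host, have in installed.items()
            if required - have}

# Illustrative inventory snapshot; real data would come from an asset database.
required = {"MS08-067", "MS09-034"}
installed = {
    "web-01": {"MS08-067", "MS09-034"},
    "db-01": {"MS08-067"},
    "test-07": set(),   # test systems count too, per Goudie's advice
}
gaps = missing_patches(installed, required)
```

Here `gaps` would report that `db-01` and `test-07` still need attention, while fully patched hosts drop out of the result entirely.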
5. External validation
According to Goudie, enterprises should also seek a third-party view of their patch management environment. "We all tend to look at our own work and see what we think is there, rather than what is really there.
"Get someone from a different part of the company--an auditor [or] a consultant to review your patch management system and processes, and test what has been deployed," he said.
Should fixed schedules for patching be eliminated?
Symantec's Ng noted that the time between the announcement of a vulnerability and the release of exploit code has "shortened dramatically", leaving businesses less than a week to patch holes. In addition, attacks are also becoming more malicious and sophisticated.
In view of this, software vendors and developers should not just focus on a fixed patch schedule, but instead employ a combination of a fixed cycle and a "when necessary" routine for greater flexibility and agility, he pointed out.
"However, it is important to realize that patch management is a two-way street," Ng said. "Vendors may provide the relevant patches in a timely manner, but it rests with the organization to apply them as quickly as possible.
"The failure to deploy patches promptly or correctly can bring about significant impact to the organization such as mass outages, security breaches and loss of revenue."
Software players are typically hesitant to issue patches out of their regular schedules. Last month, Adobe Systems' Brad Arkin said in a blog post that the company decided not to produce an out-of-cycle fix for a vulnerability in its Acrobat and Reader, as it would take about two to three weeks to code and "negatively impact the timing of the next quarterly security update" scheduled for Jan. 12.
However, Microsoft last July deviated from its monthly Patch Tuesday to release an emergency patch for a critical vulnerability in Internet Explorer and a less severe one in Visual Studio.
Verizon's Goudie said the regular patching cycle has its merits for organizations--offering them "predictability" and ensuring systems are available when needed.
"The current status quo of a fixed patch cycle is what all enterprises have based their patch management processes around, and a change to on-demand patching would require a major retooling for many enterprises," he pointed out.
IDC's Chan noted that patching schedules may not be an issue in future: "Software [is] surfacing with capabilities to self-repair, automatically patching errors in deployed software."
In the "near future", the research firm expects patching systems and processes to become more intelligent and fix software in real time, he said.

Data leakage prevention still lacks bite

Despite the proliferation of security products, the quantity and magnitude of high-profile data leaks continue to rise. With this increase in information leakage comes a higher data breach cost.
According to a 2009 Ponemon Institute study, data breach incidents cost U.S. companies US$202 per compromised customer record in 2008, compared with US$197 in 2007. The average total per-incident cost in 2008 was US$6.7 million, compared with US$6.3 million in 2007.
Even though data leakage has become a mainstream problem, the number of companies that have bought and adopted data leakage prevention (DLP) technologies remains small. Is DLP at a standstill, or is it at a crossroads?
Not just about infrastructure
There is a huge misconception that organizations can just add another piece of infrastructure to solve data leakage problems. Securing the network connection is an important step, of course, but this usually does not result in truly improving the overall security of what enterprises care about--sensitive data.
In fact, today's networks and applications were designed without taking into account the overall security of information. The connectivity of "all" business networks to each other through the Internet was never in fact designed with that in mind--it just happened.
As such, the infrastructure actually has no context for what the exchange of information means, leaving many gaps in network security.
In our highly connected world, multiple applications can access the same data. It is not surprising if the number of entry and exit points of an organization's network far exceeds the number of machines or even users.
Just take a look around us. There are laptops, servers, smart phones, mobile phones, social networks--all these are potential leakage points. Also, the lack of awareness by computer users often results in important information "leaving" the organization.
Compounding this problem is that personal information is usually stored on multiple networks running multiple policies, without any control from the information "owner". Entities that hold consumer information are not necessarily savvy when it comes to IT or to information security issues. Their only objective is to satisfy regulations.
The number of access points is further increased when companies engage in contract work and offshoring activities. In the 2009 Ponemon Institute study, third-party organizations accounted for more than 44 percent of data breach cases in 2008, and are also the most costly form of data breaches as a result of additional investigation and consulting fees.
Closing a single exit point provides little value; what the organization needs is an overall strategy for protecting its data.
Regulations, policies, compliance
To manage the current situation, more regulations are being implemented. As we need ways to measure progress, compliance is a good way to achieve that.
But this also means that solving the compliance problem is usually the driver for businesses buying security products. Unfortunately, that drives vendors to satisfy regulations rather than actually attempting to solve the problem.
In reality, many PCI-certified businesses are still getting successfully attacked. Hence, implementing regulations can be more expensive than solving the actual problem at hand.
As companies look to address these challenges, they turn to an array of DLP vendors and technologies, only to face exponentially more complexity at every phase of security--from selection to integration to management.
Here's a quick look at some types of technologies that are presently used:
•  Network DLP
Network DLP offerings are enterprise tools that manage access to important information. They usually crawl the network and "fingerprint" files and records that contain information of a particular type.
These products are usually able to detect if a particular important file or dataset is being transferred somewhere. However, detecting whether the file is being sent to an authorized location is particularly difficult, unless the DLP application has knowledge of many applications and their expected behaviors.
Network DLP technologies also lack the capacity to monitor data that is managed locally at an endpoint, such as personal e-mail and mobile devices. So, it is difficult to thwart insider threats if the endpoints are not guarded.
At the same time, encrypted data can cause problems if the enterprise does not manage the encryption well.
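The chunk-hash "fingerprinting" approach described above can be sketched in a few lines. Real products use rolling hashes and far more robust matching, so this is a deliberately simplified illustration:

```python
import hashlib

CHUNK = 64  # bytes per fingerprinted chunk; real systems tune this value


def fingerprint(data: bytes) -> set:
    """Hash fixed-size chunks of a sensitive file into a fingerprint set."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data) - CHUNK + 1, CHUNK)}


def looks_like_leak(outbound: bytes, prints: set) -> bool:
    """Flag an outbound payload if any byte window matches a known chunk hash.
    Scanning every offset lets matches survive prepended headers; note that
    this sees nothing inside well-encrypted traffic."""
    return any(hashlib.sha256(outbound[i:i + CHUNK]).hexdigest() in prints
               for i in range(max(len(outbound) - CHUNK + 1, 0)))
```

A monitor built this way can spot a known sensitive file embedded in an outgoing message, but, as the article notes, it still cannot tell an authorized destination from an unauthorized one without knowledge of the applications involved.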
•  Endpoint DLP
These are agents or client software that reside at endpoints, such as mobile devices, computer ports, personal e-mail and instant messaging. An agent running on a particular PC can detect if an important file is being transferred out. However, managing the same information across the corporate network or other networks, requires additional infrastructure.
It is important to note that policies at the servers and endpoints are usually different in nature and need different ways to be managed. While the industry has been trying to come up with standards in the policy domain, this may take a long time. As such, different vendors and different departments are likely to still come up with different non-interoperable policies.
•  Embedded DLP
Can companies deploy a "partial" solution, where only applications that they decide are critical are monitored? And can they have DLP integrated into certain applications so they do not have to build a separate management infrastructure?
The answer lies in embedded DLP.
DLP should be embedded inside the pipe, such as the e-mail system. If companies build applications in a way where DLP is in the middle of the application, they can then implement the policy correctly because they are directly addressing the data flow. The content that is supposed to travel between networks knows where it is supposed to go and who should be given access to the data.
Ultimately, if companies truly want security, it may be better to implement DLP piecemeal in each application.
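A minimal sketch of DLP embedded in the e-mail "pipe" described above might look like the following; the policy format, file name and domains are illustrative assumptions:

```python
# Policies attached to the content itself, naming who may receive it.
# The attachment name and domain below are hypothetical examples.
POLICIES = {
    "q3-forecast.xlsx": {"allowed_domains": {"finance.example.com"}},
}


def may_send(attachment_name: str, recipient: str) -> bool:
    """Check an attachment's embedded policy before the mail system sends it.
    Attachments with no policy pass by default in this toy; a real
    deployment would choose that default deliberately."""
    policy = POLICIES.get(attachment_name)
    if policy is None:
        return True
    domain = recipient.rsplit("@", 1)[-1]
    return domain in policy["allowed_domains"]
```

Because the check sits in the data flow itself, the policy travels with the content rather than depending on any one network's perimeter, which is the point of the embedded approach.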
Overall security strategy
Enterprises need to think about security as a suite of business processes, rather than about implementing a particular technology or two. What is truly needed is a suite of features in existing technologies that provide elements of DLP to help address the problem incrementally over time.
In addition, the management of the chosen technologies requires knowledge and effort.
DLP will not be successful until the industry knows how to describe policies for content independent of where and how the content is accessed.
One business partner can enforce its policy on a transaction as it travels between other partners. A policy can include aspects of access to the information by the partner or others. Embedding policies in content is perhaps the only way to enforce policies across company boundaries.
Enforcing policies will also continue to be difficult in different verticals.
There are regulation-based policies, under which businesses such as retail firms need to ensure credit card and other account data are not leaked. And there are also enterprise policies specific to a business, such as a chip manufacturer.
The challenge for security services providers is to provide customers with a good set of templates to build upon, rather than push a product out of the box. Deep experience of an industry segment will also help a lot.
DLP is a very important aspect of managing enterprise networks that host important data. It is possible to focus on critical applications and deploy aspects of DLP without affecting the management of other applications or other parts of the network.
Current technologies and products are going in the right direction, but the overall management remains a challenge.
Efforts need to be coordinated among the various industry groups to streamline and standardize security support. While we are making progress as an industry, a lot more is needed.

Social media comes of age, profitability

It may allow all of only 140 characters, but microblogging site Twitter took off to greater heights last year, alongside fellow social network bigwig Facebook.
The past 12 months also saw smartphones basking in the limelight, as the segment bucked an overall handset market slide to register a 27 percent increase in worldwide shipments.
Here's a look at five keywords that defined the tech industry in 2009.
Twitter
Love it or hate it, the real-time phenomenon known simply as Twitter is here to stay, particularly after the microblogging site has demonstrated such significant global reach and influence.
The Iranian presidential election held in June is a case in point of how Twitter can be used to rally the masses for a cause. When incumbent President Mahmoud Ahmadinejad was proclaimed the winner on Jun. 12, Iranian supporters of opposition candidate Mir-Hossein Mousavi took to the streets--and Twitter--to voice their discontent.
Twitter users changed their profile backgrounds and avatars to green--Mousavi's party colors--in protest of suspected fraud and manipulation of the polling results. A CNN report noted that the post-election street protests eventually came to be known as the "Twitter Revolution".
More significantly, it showed the world how real-time news updates by civilian journalists can and have trumped traditional media outlets with the speed of information disseminated to a wide audience. According to a Mashable blog post, "CNN's complete lack of coverage early in the Iranian election crisis spurred a vicious backlash from Twitter users".
As for the business itself, 2009 saw the microblogging site finally achieve profitability. Since its creation three years ago, the startup has focused more on adding to its subscriber base rather than generating revenue.
But after inking US$25 million worth of deals with search giants Google and Microsoft in October 2009, the social media company now has the funds to further expand its business. Twitter's Internet search deals will make Tweets published on the site searchable on the other two companies' search platforms, according to a Bloomberg Businessweek report, lending a real-time element to search results. Not to be outdone, however, Yahoo announced its own real-time search plans days later.
Facebook
Another social-networking site also made waves last year. In fact, some may argue that it made more of an impact than Twitter did. That site is Facebook.
According to a Mashable blog post, Facebook grew from 150 million users in January to about 350 million users by the end of last year--no small feat, considering Twitter garnered a 2009 forecast of 18 million U.S. users by market research firm eMarketer.
Furthermore, Facebook was among the most searched-for words of 2009, according to research by Google. In the search giant's annual Zeitgeist report, which looks into queries typed in by users around the world, the term "Facebook" was listed second in the "Fastest rising (Global)" category; fellow social media site "Twitter" came in fourth.
There was a similar trend in the Zeitgeist breakdown for Singapore. Facebook topped the "Most Popular Searches of 2009" segment, while "Facebook log-in" came in at seventh. In comparison, Twitter came in at only seventh in the "Fastest Rising Searches of 2009" category.
Recently, the social-networking site also stated its intentions to become a digital identity repository, or what Facebook's vice president for products, Chris Cox, termed an "identity medium" for transactions between individuals and businesses inhabiting the Web.
Already, its universal login product, Facebook Connect, has linked up with more than 80,000 third-party sites such as Yahoo, Foursquare and Gowalla to provide data portability over the Internet. To date, more than 60 million users have signed up to use the service.
This move is expected to gain more traction in 2010, as Facebook starts to open up its valuable user database to advertisers. The Yahoo-Facebook Connect deal could be seen as a precursor of things to come as the search engine operator intends to tap the former's user data to place more relevant ads on its own Web site.
Smartphones
It was a year of apples and droids, too. In other words, smartphones.
While overall mobile handset sales fell six percent year-on-year, from 305 million to 286 million units, in the second quarter of 2009, according to research firm Gartner's figures, global smartphone sales bucked the trend, increasing 27 percent from 32 million to 41 million units over the same period.
The mobile handset that captured the imagination of consumers for most of 2009 was Apple's iPhone as it continued to build on its increasing popularity with the updated iPhone 3GS introduced in June.
More than just a phone, the iPhone is a multimedia platform that lets users do almost anything: from buying gifts online to paying bills, and from catching up with friends on social-networking sites to playing online games through applications downloaded from its App Store.
This had the industry buzzing as software developers, trying to ride out a year when venture funds were drying up, sought to write a "killer app" that would be downloaded widely to bring in the money.
Later in the year, smartphones running Google's open source operating system, Android, started to gain much market traction as U.S. wireless operators such as Verizon and Sprint Nextel collaborated with device manufacturers like Motorola and HTC to launch Android phones.
Besides giving consumers an alternative, the Android Market proved a welcome addition for software developers struggling to get their applications approved by Apple, whose opaque approval process was widely criticized last year.
Furthermore, the market is anticipating the imminent launch of Google's own Android OS-based smartphone, the "Nexus One", to further shake up the smartphone industry.
Cloud computing
Most industry insiders are predicting 2010 to be the breakthrough year for cloud computing.
But enterprises were already looking closely at the technology in 2009, as businesses sought to cut operational costs and increase productivity by maximizing their IT resources during the recession and ensuing recovery period.
A survey conducted last June by AppLabs showed that 50 percent of 104 Global 2000 companies were already using cloud computing services or planning to deploy them in their IT infrastructure within the next 12 months.
According to research firm Gartner's definition, cloud computing services are defined as "service-based, scalable and elastic, shared, metered by use, and delivered using Internet technologies".
However, many CIOs and small and midsize businesses (SMBs) are still waiting for cloud service providers to improve on their security and compliance and to establish a common standard in the industry before they jump on the bandwagon.
Geo-based applications
Everyone, from Apple and Google to young startups such as Foursquare, Gowalla and Brightkite, is looking to tap the now-ubiquitous global positioning system (GPS) to serve up geolocation-based service applications this year.
Google, for one, launched its Google Latitude service last February to allow mobile phone users to keep in touch with their friends and loved ones, wherever they may be based.
However, speculation grew that Apple was trying to enter the geolocation market, too, after it bought maps application programming interface (API) company Placebase in July. The Cupertino company also blocked Google Latitude from being made available as a native iPhone app on its App Store, instead requesting that the search giant run it as a Web application so as not to confuse users of the existing Google Maps app, fueling suspicion that Apple was developing similar software.
Meanwhile, mobile startups such as Gowalla, Foursquare and Brightkite are emerging as geolocation-based service software developers touted to become the next Twitter or Facebook phenomenon. In fact, founder and CEO of Mashable, Peter Cashmore, had earlier predicted that Foursquare would be 2010's social media "poster boy".
Twitter, though, has been making strong inroads to ensure it continues to dominate the social media scene and fend off the startups. To that end, the microblogging site in December acquired Mixer Labs, the company behind the GeoAPI location service for developers building applications atop Twitter.
By using this new capability to include geolocation information in its users' Tweets, Twitter may be well poised to retain market share and beat back the upstarts in the social media sphere.

