Phishers ride on financial crisis theme
Posted by Sinlung
Phishing attacks have doubled in the last month, with phishers riding on the downturn in the economy to pose as financial institutions, said Symantec.
According to the antivirus company's latest MessageLabs Intelligence Report, the recession theme has seen a revival in spam over the past month.
"At a time when concerned consumers may not be surprised to hear from their banks, phishing attacks have risen to one in 190 e-mail messages, from one in 396 in January 2009," said the report.
"Recession spam" messages have also surfaced, carrying text strings such as "money is tight, times are hard". February saw the reappearance of search engine redirects referencing the financial crisis, for the first time in over a year, said Symantec.
Overall, however, spam declined by 1.3 percent to 73.3 percent of all e-mail messages in February. The report added that this figure includes a spike to 79.5 percent at the start of the month, due in part to Valentine's Day-themed spam.
Symantec said the vast majority of such spam originated from the Cutwail (Pandex) botnet, which pushed out an estimated 7 billion Valentine's Day-themed messages each day.
Paul Wood, MessageLabs Intelligence senior analyst at Symantec, said: "Although spam levels declined slightly this month, the level of activity around Valentine's themed spam reached unprecedented highs accounting for 9 percent of all spam messages."
The report said all countries saw a slight dip in spam levels this month, with the rate falling to 57 percent in the United States, 52.6 percent in Canada and 66.6 percent in the United Kingdom. Germany's rate was 69.1 percent and the Netherlands' 67.4 percent.
In Australia, the rate was 68.5 percent; in Hong Kong, 72.8 percent; in China, 67.8 percent; and in Japan, 65.6 percent.
India topped the rankings for viruses, however, with virus activity rising by 0.16 percent to one infected e-mail in every 197.4 messages.
Ovum: Treat cloud as traditional asset
Posted by Sinlung
Don't treat cloud services any differently from other IT assets in an organization, advised Ovum.
Following the recent crash of social bookmarking site Ma.gnolia, which wiped out all of its users' data, the analyst firm released a statement advising organizations against placing too much trust in the cloud.
David Mitchell, Ovum senior vice president of IT research, said: "CIOs should treat cloud services the same way as they treat other IT assets that they use.
"They need to ensure that they have effective backup and recovery plans for the data held in cloud services, in the same way as they would for on-premise services--whether those backup services are provided by the cloud provider or by the CIO."
Mitchell said unless it is acceptable for cloud services to have unknown service levels and for data to be lost, customers of cloud providers should treat these services with "the same seriousness as they would mainstream IT purchases".
The cloud is not invincible
The mistake of many organizations is making the assumption that cloud-based services have access to "near-infinite pools of computing resources and that these resources are operated to 'best in class' standards", said Mitchell.
The reality, however, is that cloud providers have the same investment criteria as traditional businesses, and rely on revenues to justify their capital investments in infrastructure, he highlighted.
Beyond this, Mitchell offered advice to CIOs looking to integrate cloud services into their infrastructure. In order to achieve richer functionality, CIOs should place emphasis on interoperability.
This will also allow different services to work together, potentially allowing for a support network to be built--two storage cloud services could mirror each other, for example.
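As a rough illustration of that mirroring idea, the sketch below writes every object to two independent storage backends so the loss of one provider (as happened with Ma.gnolia) does not wipe the data. The backends here are simple in-memory stand-ins, not any real cloud provider's API:

```python
# Minimal sketch of mirrored cloud storage (illustrative only).
class InMemoryBackend:
    def __init__(self, name):
        self.name = name
        self.objects = {}

    def put(self, key, data):
        self.objects[key] = data

    def get(self, key):
        return self.objects.get(key)

class MirroredStore:
    def __init__(self, primary, secondary):
        self.backends = (primary, secondary)

    def put(self, key, data):
        # Write to both services; a real deployment would also verify and retry.
        for backend in self.backends:
            backend.put(key, data)

    def get(self, key):
        # Fall back to the secondary if the primary has lost the object.
        for backend in self.backends:
            data = backend.get(key)
            if data is not None:
                return data
        return None

store = MirroredStore(InMemoryBackend("cloud_a"), InMemoryBackend("cloud_b"))
store.put("bookmarks.json", b"{...}")
store.backends[0].objects.clear()   # simulate one provider losing everything
print(store.get("bookmarks.json"))  # still recoverable from the mirror
```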
"Ultimately, interoperability around the cloud needs to be taken more seriously and offer progressively richer functionality, so that cloud-to-cloud and cloud-to-on-premise integration is seamless and can become part of the standard corporate architecture," said Mitchell.
More contract IT jobs expected in S'pore
Posted by Sinlung
SINGAPORE--A significantly higher number of jobs in the country will be offered on a contract basis this year, as organizations look for flexible employment terms to circumvent headcount restrictions, a human resource survey revealed.
According to the Robert Walters Singapore Salary Survey 2009 released Wednesday, many roles that were traditionally offered as permanent jobs will be converted into contract positions.
"With organizations looking to justify each and every permanent headcount, we would expect the majority of requirements for project initiatives to be offered as contract opportunities," the survey noted.
The human resource agency, which offers recruitment services for mid- to senior-level IT positions, anticipates that candidates applying for these jobs will warm up to contract roles as the perceived job security of permanent positions lessens during the recession.
On whether this employment practice will continue after the recession, Adam Bowden, manager of the recruiter's IT commerce division, noted that it is too soon to determine either way.
However, Bowden told ZDNet Asia in an interview, the recession could be "a launch pad for continued acceptance" of contract employment after the economy turns for the better.
According to the Robert Walters report, demand for candidates with experience in project management, service delivery, business analysis, network implementation and infrastructure consolidation will be particularly strong throughout 2009.
Bowden said projects resulting from the S$1.14 billion (US$749 million) worth of infocomm tenders the Singapore government announced last year will also boost the popularity of contract jobs in the island-state.
"Also, when the Youth Olympic Games comes to Singapore in 2010, contract-based tech skills will be required for that as well," Bowden said.
Slower hiring rate
Not surprisingly, Robert Walters expects slower recruitment this year in Singapore's financial services sector, where salaries are likely to be static or to rise only moderately.
Despite the decline, the recruitment firm anticipates continued demand by the sector for candidates with experience in service delivery, project management, risk and regulatory applications, systems architecture, application support and infrastructure operations.
As organizations in the manufacturing, logistics, pharmaceuticals and petrochemical sectors move toward outsourcing non-core IT functions, Robert Walters expects to see headcount reductions in this area, particularly at the junior to mid-level.
For job candidates, the silver lining is that as outsourcing becomes more prevalent, user organizations will look for candidates with experience in vendor, contract and relationship management, when recruiting mid- to senior-level IT professionals in 2009.
According to the survey, there will be high demand for experienced SAP consultants, service delivery managers and quality assurance professionals.
DNS Still Less than Secure
Posted by Sinlung
Exploits for a serious cache-poisoning vulnerability discovered in the Domain Name System (DNS) last year have begun to appear in the wild, and they have made security researcher Dan Kaminsky a believer in DNS Security Extensions (DNSSEC).
"I've never been a DNSSEC supporter," Kaminsky said today at the Black Hat Federal security conference being held in Arlington, Va. "At best, I've been neutral on the technology."
Kaminsky, director of penetration testing at IOActive Inc., last year discovered the vulnerability in the DNS that underpins the Internet and helped to engineer the release of a patch for it. The patch, which introduced more port randomization into DNS servers, was merely a quick fix and Kaminsky said he has come to the conclusion that no security technology except DNSSEC can scale well enough to fix the problem.
"DNS scales like absolutely nothing else," he said. That is why the technology has been so long-lived and ubiquitous. "DNS is the only way to scale systems across organizational boundaries." And that means it extends its unsecurity to everything it touches -- which is nearly everything on the Internet.
The problem with DNSSEC is that it is difficult to deploy and manage, and it has been adopted only slowly and reluctantly. The federal government is leading the way by deploying it in the top-level .gov domain this year, but the General Services Administration, which is spearheading the move, said earlier this month that implementation will be delayed by one month. The .gov registry, the registrar and the .gov DNS servers were supposed to have been digitally signed by the end of January, but GSA officials said the agency had identified an additional feature that is needed and that they expect to deploy DNSSEC by the end of February.
"We have got to make DNSSEC deployable," Kaminsky said. "It has to be so much simpler than it is today."
DNS is a hierarchical system that translates written domain names such as those in URLs and e-mail addresses into IP addresses. The vulnerability discovered by Kaminsky could allow cached data in DNS name servers to be poisoned and Web requests to be misdirected, sending users to unknown Web sites. Web poisoning exploits already were known, but because the new vulnerability is in the basic design of the protocol itself, it is potentially more dangerous than previous problems.
The vulnerability involves a weakness in the transaction ID used in DNS queries. Currently, replies to a DNS query have to contain the proper transaction ID, which is chosen randomly from 65,000 values. "For undisclosed reasons, 65,000 is just not enough," Kaminsky said when the vulnerability was announced. "We needed more randomization."
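Some back-of-the-envelope math shows why a 16-bit transaction ID alone is weak and what the extra randomization buys. The number of forged replies and the usable port range below are illustrative assumptions, not figures from Kaminsky or the report:

```python
# Rough odds of an off-path attacker winning one poisoning race.
txid_space = 65536          # 16-bit transaction ID (the "65,000 values")
port_space = 60000          # rough count of usable random source ports
spoofed_replies = 50000     # forged answers the attacker races in (assumed)

p_txid_only = 1 - (1 - 1 / txid_space) ** spoofed_replies
p_txid_and_port = 1 - (1 - 1 / (txid_space * port_space)) ** spoofed_replies

print(f"TXID only:   {p_txid_only:.2%}")    # better-than-even odds per race
print(f"TXID + port: {p_txid_and_port:.6%}")  # search space is ~60,000x larger
```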
That was provided with the patch released in July 2008. Adoption of the patch has been good, but not good enough. Kaminsky said 66 percent of name servers had been patched after one month, much better than the average of 50 percent in a year for most patches. Today, an estimated 75 percent of name servers have been patched, but that means that 25 percent still could be vulnerable.
And the port randomization patch was never meant to be a permanent fix. "We have bought you as much time as possible," Kaminsky said when it was released.
He said today that monitoring by a team from Georgia Tech had uncovered evidence of exploits for the vulnerability, with occasional brief spikes in misdirected queries.
"Detecting poisoning is difficult," Kaminsky said. "The evidence is written in invisible ink," and the poisoners are being stealthy. "What seems to be happening is that there is testing going on."
Indications are that 1 to 3 percent of unpatched name servers have been poisoned. That is a small percentage, but there currently are an estimated 11.9 million name servers facing the Internet, and nearly 3 million of them probably are not patched. That means that as many as 89,250 servers could have been quietly poisoned.
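Those estimates multiply out as stated; the figures below simply recompute the article's own numbers:

```python
name_servers = 11_900_000               # estimated Internet-facing name servers
unpatched = name_servers * 0.25         # roughly 3 million still unpatched
print(int(unpatched * 0.03))            # 89,250 at the 3 percent upper bound
```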
DNSSEC is a security protocol that allows DNS queries and answers to be digitally signed and authenticated. With DNSSEC, answers to requests are digitally signed to protect clients from forged DNS data. The protocol provides authentication of the origin of DNS data, data integrity, and authenticated denial of existence for an address that cannot be found. To date, only a statistically insignificant number of zones are believed to be using DNSSEC.
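The core guarantee DNSSEC adds, signing the answer data so a resolver can verify it before trusting it, can be shown with a toy sign-then-verify round trip. This sketch only illustrates the principle and assumes the third-party cryptography package; real DNSSEC uses DNSKEY and RRSIG records, canonical wire formats and a chain of trust, none of which is modeled here:

```python
# Toy illustration of origin authentication (not real DNSSEC record handling).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

zone_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
answer = b"www.example.gov. 3600 IN A 192.0.2.10"   # made-up answer data
signature = zone_key.sign(answer, padding.PKCS1v15(), hashes.SHA256())

resolver_trusted_key = zone_key.public_key()
try:
    resolver_trusted_key.verify(signature, answer,
                                padding.PKCS1v15(), hashes.SHA256())
    print("answer accepted: signature checks out")
except InvalidSignature:
    print("answer rejected: possible forgery or poisoning")
```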
Some appliances are available that automate DNSSEC processes so that manual signing and updating are not needed. Servers generate key material and sign automatically so that administrators do not have to manage the process. However, it still is not simple enough, Kaminsky said.
"Substantial work needs to be done with DNSSEC," he said. "But I think it can be done. There is no reason to think it is impossible."
Having a cow over Gmail just misses the point
Posted by Sinlung
By Charles Cooper
A big outage at Google Tuesday. Things go dark early while most of the U.S. is sleeping. Still, the Internet is without borders, and so the glitch leaves millions of people who use Google Web mail and Google Apps high and dry.
It was mild melodrama, but things returned to normal after a few hours. It's still unclear what happened, though Google says it's investigating the problem.
Truth be told, the walls of Jericho did not crumble, though the outage nonetheless triggered the (now thoroughly predictable) hand-wringing and bloviating from the usual cast of characters. Amusing to watch, but after this incident, there's also the wider context to consider.
Any outages are embarrassing. But while Gmail did crash a few times in 2008, this is the first time the service has gone down in quite a while. (As my colleague Stephen Shankland noted, Google extends a guarantee to corporate customers paying for any of its business Apps services, which rely on the cloud. The promise: they will be able to access Gmail at least 99.9 percent of the time every month. If not, Google pays them a penalty fee. So far Google says it hasn't fallen below that mark.)
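For context, a 99.9 percent monthly uptime promise leaves a surprisingly small downtime budget. The figure below is a back-of-the-envelope calculation from that percentage, not Google's own number:

```python
# Downtime allowed under a 99.9 percent monthly availability target.
minutes_per_month = 30 * 24 * 60
allowed_downtime_minutes = (1 - 0.999) * minutes_per_month
print(round(allowed_downtime_minutes, 1))  # about 43.2 minutes per month
```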
If these sorts of outages occurred with more regularity, I suppose that would seriously retard cloud computing's growth. Google and Salesforce.com and Amazon and any other purveyors of cloud-based services obviously cringe when their connections fail. Not to underplay the anguish customers and vendors find themselves dealing with, but the real news here is how rare these cloud-computing outages have become.
A few years ago it seemed that eBay's Web site was seizing up all of the time. The reality was less severe but merchants and bidders would scream bloody murder. At the same time, eBay, Yahoo, Amazon, and Buy.com were dealing with repeated denial-of-service (DoS) attacks. Things got so bad that some even feared for the future of e-commerce.
We now know how the story turned out. Fact is that there are no 100 percent guarantees anymore, not in a world in which applications increasingly get hosted on the Internet. When things go bump in the night, as they inevitably will, there is going to be a commotion, albeit a temporary one. Get over it, already.
This is computing, after all.
New variant of Conficker worm circulates
Posted by Sinlung
A new variant of the Conficker Internet worm is circulating that opens up a backdoor that could allow an attacker to distribute malware to infected machines, the US-CERT organization warned Monday.
The new Conficker/Downadup worm, dubbed "Conficker B++", uses a new backdoor with "auto-update" functionality, CERT said in an advisory.
Microsoft says there is no indication that systems infected with previous variants of Conficker can automatically be re-infected with the new variant, CERT said.
Previous versions of Conficker took action to prevent further exploitation of the vulnerability, Microsoft said in an advisory of its own.
"We've discovered that the new variant no longer patches netapi32.dll against all attempts to exploit it. Instead it now checks for a specific pattern in the incoming shellcode and for a URL to an updated payload," said Microsoft, which is offering a US$250,000 reward to stop the Conficker worm. "The payload only executes if it is successfully validated by the malware. However, there doesn't appear to be an easy way for the authors to upgrade the existing Conficker network to the new variant."
The worm, which has been around since last year, spreads through a hole in Windows systems, exploiting a vulnerability that Microsoft patched in October.
Conficker also spreads via removable storage devices like USB drives, and network shares by guessing passwords and user names.
Meanwhile, the previous versions of Conficker have been busy. Conficker.A has affected more than 4.7 million IP addresses, while its successor, Conficker.B, has affected 6.7 million IP addresses, with infected hosts totaling fewer than 4 million computers for both, according to a technical report by SRI International.
Exiting workers taking confidential data with them
Posted by Sinlung
As layoffs continue apace, a survey released on Monday shows what many companies fear--exiting workers are taking a lot more with them than just their personal plants and paperweights.
Of about 950 people who said they had lost or left their jobs during the last 12 months, nearly 60 percent admitted to taking confidential company information with them, including customer contact lists and other data that could potentially end up in the hands of a competitor for the employee's next job stint.
"I don't think these people see themselves as being thieves or as stealing," said Larry Ponemon, founder of the Ponemon Institute, which conducted the online survey last month. "They feel they have a right to the information because they created it or it is useful to them and not useful to the employer."
The survey also found a correlation between people who took data they shouldn't have taken and their attitude towards the company they are leaving. More than 60 percent of those who stole confidential data also reported having an unfavorable view of the company. And nearly 80 percent said they took it without the employer's permission.
Most of the data takers (53 percent) said they downloaded the information onto a CD or DVD, while 42 percent put it on a USB drive and 38 percent sent it as attachments via e-mail, according to the survey.
The survey also found that many companies seem to be lax in protecting against data theft during layoffs. Eighty-two percent of the respondents said their employers did not perform an audit or review of documents before the employee headed out the door and 24 percent said they still had access to the corporate network after leaving the building.
The survey was commissioned by Symantec, which offers software that helps companies protect against data loss by indexing databases and monitoring for patterns of word combinations that might be used by exiting employees to steal data. The Symantec software also can monitor outbound e-mail for confidential data and alert IT if large amounts of certain types of data, such as Social Security numbers, are being copied to removable storage devices.
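The kind of pattern scanning described can be sketched in a few lines. The regex, threshold and alert behavior below are illustrative choices, not Symantec's implementation:

```python
# Flag outbound content containing many Social Security number-like strings.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
ALERT_THRESHOLD = 25   # matches in one message/file that trigger an alert (assumed)

def scan_outbound(content: str) -> bool:
    matches = SSN_PATTERN.findall(content)
    if len(matches) >= ALERT_THRESHOLD:
        print(f"ALERT: {len(matches)} SSN-like values leaving the network")
        return True
    return False

# Example: a departing employee e-mails an exported customer table.
export = "\n".join(f"row {i}, 123-45-{6000 + i:04d}" for i in range(100))
scan_outbound(export)
```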
IBM highlights new threats
Posted by Sinlung
IBM has pointed to new technology such as virtualization, cloud computing and SaaS (software as a service) as new vectors for security attacks, together helping to drive the security market in Thailand to around one billion baht (US$28 million) this year.
Over the next two to five years, emerging technologies such as virtualized environments, cloud-enabled services and SaaS will lead to an explosion in digital identities that need to be managed, while the proliferation of mobile phones and PDAs as access points to the Internet will have a profound effect on enterprise security, according to IBM GRC market manager Marne E. Gordon.
Today's disruptive technologies increase the complexity of security management for organizations because they are moving targets. The situation is further complicated by the high number of mergers, acquisitions and outsourcing contracts that make businesses interdependent. Moreover, during the economic downturn, criminals will more aggressively pursue data crimes, which can make far more money than traditional crimes. There is also more removable media and unstructured data today, which creates opportunities for data leakage, along with a rise in the trusted-insider threat, especially in banking. However, before businesses invest in new technology, they should look back at their basic, fundamental security infrastructure, Gordon said.
She added that regulatory compliance is another force driving security trends, pointing to the Payment Card Industry (PCI) data security standards, Sarbanes-Oxley, which governs the disclosure of financial accounting, and the Health Insurance Portability and Accountability Act (HIPAA).
She cited Harley-Davidson as an example of a company whose varied businesses all have to comply with many regulations. More and more big enterprises are now implementing a single compliance policy that combines these different siloed compliance projects into one, providing a holistic view to executives and reducing redundancy.
Last year, IBM spent US$1.5 billion globally on security and created a security framework that identifies five key areas: people and identity; data and information; application and process; network, server and endpoint; and physical infrastructure. The company's approach is to manage this risk strategically, end to end across the organization.
This will allow businesses to better understand and prioritize risks and vulnerabilities based on their potential to disrupt critical business processes.
For Thailand, security is a growing area made up of good governance, identity management, business continuity and managed security services. In particular, Thai-based businesses, even SMEs, need to comply with the Computer Misuse Act, which will bring the market to around one billion baht (US$28 million) this year, IBM Thailand GTS country executive Dhanawat Suthumpun added.
Microsoft working on ultra-secure browser
Posted by Sinlung
Microsoft staff are working on an Internet browser designed to be more secure than either Internet Explorer or any of its rivals. The project is still at the prototype stage, and the browser needs work to improve its sluggish performance.
The researchers have just published a paper outlining the project, code-named Gazelle. It is clearly not part of Microsoft's mainstream development effort, as the researchers openly criticize the way Internet Explorer works.
Gazelle takes a key security feature from IE8 and Google's Chrome and pushes it to the next level. Both of those browsers aim to reduce security problems by changing the way the computer handles situations where a user has multiple sites open in one browser using tabs.
The two browsers treat each tab as if it were a completely separate process in Windows. That not only reduces the likelihood of one frozen tab crashing the entire browser, but also reduces the chances of security issues caused by secure and compromised Web pages running at the same time.
In Gazelle, every element of a Web page such as frames or plug-ins (for example streaming videos or Flash animations) is treated as a separate process. The browser also runs with its own kernel, effectively making it a separate operating system and allowing it to act more intelligently.
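A minimal sketch of that "one element, one process" idea, using ordinary OS processes coordinated by a small kernel loop, might look like the following. The element names and message format are invented for illustration; this is not code from the Gazelle paper:

```python
# Conceptual sketch: one OS process per page element, mediated by a "kernel".
import multiprocessing as mp

def render_element(name, source, results):
    # Each element (frame, plug-in, script host) runs isolated in its own
    # process; a crash or exploit here cannot directly reach the kernel or
    # sibling elements, only this message channel.
    results.put((name, f"rendered {len(source)} bytes"))

def browser_kernel(page_elements):
    # The kernel owns the channels and mediates everything the elements do.
    results = mp.Queue()
    workers = [
        mp.Process(target=render_element, args=(name, src, results))
        for name, src in page_elements.items()
    ]
    for w in workers:
        w.start()
    for _ in workers:
        name, status = results.get()
        print(f"kernel received from {name}: {status}")
    for w in workers:
        w.join()

if __name__ == "__main__":
    browser_kernel({
        "main_frame": b"<html>...</html>",
        "ad_iframe": b"<html>ad</html>",
        "flash_plugin": b"\x00\x01\x02",
    })
```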
The downside of this system is that plug-ins have to be rewritten to work within the ‘one element/one process’ system. That could create a chicken and egg situation with independent plug-in makers unwilling to recode their products until such a browser was popular, and users unwilling to use the browser until plug-ins worked.
The researchers tested a prototype on the 20 busiest Web sites and found 19 worked largely as intended. However, the browser took almost twice as long as Internet Explorer 7 to load major Web sites and used considerably more memory just to display a simple page such as the Google.com homepage.
Microsoft aims to build a better thesaurus
Posted by Sinlung
A team of researchers at Microsoft is looking to beat Roget at his own game.
Aiming to build a better thesaurus, the Writing Assistance project within Microsoft's research unit is tapping techniques developed to translate from one language to another.
Although thesauri are good at finding lots and lots of synonyms, they require the user to pick the right one because they aren't very good at understanding the context of what is being said. That's where the experience from doing machine translation comes in.
"We've taken the actual translation tables...and what we've done is we've taken those and said if a word in Chinese maps to two different English words maybe those two words are synonyms with some probability," said Christopher Brockett, a computational linguist and one of the Microsoft researchers leading the project.
The approach has two key benefits over a static thesaurus. First of all, the newer approach can do phrases, as opposed to single words. Also, it can draw on the context in which the phrase is used.
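A minimal sketch of the pivoting idea Brockett describes, using an invented toy translation table, shows how synonym candidates and rough probabilities fall out of the translation data:

```python
# If one foreign word translates to two English words, treat those English
# words as synonym candidates, weighted by the translation probabilities.
from collections import defaultdict
from itertools import combinations

# Hypothetical table: foreign word -> {English word: P(english | foreign)}
translation_table = {
    "zh_1": {"purchase": 0.6, "buy": 0.4},
    "zh_2": {"buy": 0.7, "acquire": 0.3},
    "zh_3": {"large": 0.5, "big": 0.5},
}

synonym_scores = defaultdict(float)
for english_probs in translation_table.values():
    # Every pair of English words sharing a foreign pivot is a candidate.
    for (w1, p1), (w2, p2) in combinations(english_probs.items(), 2):
        synonym_scores[tuple(sorted((w1, w2)))] += p1 * p2

for pair, score in sorted(synonym_scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(score, 3))
# ('big', 'large') and ('buy', 'purchase') surface as the likeliest synonyms.
```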
Brockett plans to show off a prototype of the tool at TechFest, Microsoft's annual internal science fair. It's just one of dozens of projects that will be shown as part of an effort to expose Microsoft's business units to the work being done in Microsoft's research labs.
TechFest is sort of like "The Dating Game" for Microsoft's research and product development arms. Research teams at Microsoft set up booths, somewhat like a high-school science fair, while product teams shuffle through looking for something that might give their efforts a leg up on the competition.
For the public, TechFest can also offer a glimpse at future product directions. For example, researcher Andy Wilson showed off a number of surface computing projects in the years leading up to the debut of Microsoft's Surface product.
As is the case with most of the projects, the thesaurus effort is still in its infancy.
"We're still working on the algorithms and how much work we give to the language pairs," Brockett said. "We have to get the quality up. There are usability issues that have to be looked into."
Over time, though, Brockett hopes the technique could be used to effectively translate whole sentences. Microsoft has a demonstration of that up on its Web site, but Brockett acknowledges such a treatment shows both the potential and the current limitations of the technology.
But would-be high-school plagiarists beware. Yes, the technology could someday translate the whole Wikipedia article for you, but it would likely translate the article the same way for all your classmates as well. And plagiarism detection software is evolving along with the science of machine translation.
As for the thesaurus itself, the technology would be a natural fit for Word, which already has a built-in traditional thesaurus. But the technology could also help Microsoft in another key area: search.
That's because while search engines are good at finding things like names, which have just one form, they have a harder time finding expressions that can be phrased in multiple ways.
That's less of an issue when searching across the whole Web. For example, searches for "Who shot Abraham Lincoln?", "Who killed Abraham Lincoln?" and "Who assassinated Abraham Lincoln?" all direct you to a page with John Wilkes Booth.
However, when it comes to searching smaller universes, such as a company's intranet, that might not be the case.
"You might not find it if the words are different," Brockett said. In such cases, automatically searching using similar phrases might boost the likelihood of finding a result.
This article was first published as a blog post on CNET News.
Asia still on lookout for best IT talent
Posted by Sinlung
Employees once had their pick of jobs, but the tide has turned and it is now an employers' market. However, even in the current economic landscape, organizations are finding it challenging to recruit the best talent.
E. Balaji, CEO of India-based recruitment agency Ma Foi Management Consultants, said: "We are seeing many companies use the current situation to hire good talent, as the expectations among candidates are now more manageable than it was a few quarters ago.
"In India, job seekers formerly expected salary hikes of above 35 percent and sometimes even 100 percent when they changed jobs. Now they are comfortable with 10 to 20 percent hikes," Balaji told ZDNet Asia in an e-mail interview.
At the same time, he said, employees in India are now particularly cautious about taking up new assignments--to the extent that several companies are finding it harder than before to recruit the right talent.
Kenneth Hung, IT and telecommunications (IT&T) managing consultant at Hudson Hong Kong, said while employers would strongly question the necessity to fill up vacant positions, companies in the Chinese territory still aim to hire the best talent.
But the most suitable talent may not necessarily want to change jobs in the current economic climate, Hung noted in an e-mail interview.
In Singapore, however, concerns over job stability and the viability of their employers have led talented IT workers to consider moving to companies they think will offer better job security.
Yeo Gek Cheng, director of IT&T at Hudson Singapore, said this has increased the supply of talent in a "market where there's less competition for great hires". But while this creates an employer's market in the island-state, the challenge now for the Singapore offices of multinational companies is getting their headcount requisitions approved by corporate headquarters.
"A lot of hiring managers are frustrated at the lack of headcount commitment to Asia, which is seen as the only growth region to help propel revenue generation," Yeo said in an e-mail.
"We have a number of situations where a ready client and a ready candidate are in place, but the headcount approval process is stuck at the [global] corporate level and nothing can move in the meantime," she added.
Furthermore, she noted, the economy faltered "almost overnight", but new talent is not usually created in that short span of time. "[So while] demand for hiring has shrunk significantly, the supply of top talent has not [increased]," she explained.
Thus, in Singapore, the crunch for IT talent persists in niche or growing technologies, Yeo said. "There is less competition for such talent now, but the competition is alive...[and at the same time], there are lower salary hikes and a more manageable recruiting process is in place."
Hard to find
Balaji noted that it is also proving very difficult to find and attract the best people in India. "They are worth the effort and money as their level of performance can be several multiples of what an average member is capable of," he added.
Yeo noted that retention was one of the key challenges companies previously faced when Singapore's job market was flourishing. The hasty and competitive recruiting process then "did nothing to solve the retention challenge", she said.
It will, therefore, be less of a challenge now--compared to six months ago--to hire workers, and employers can return to interviewing, screening and engaging with candidates in greater detail before finalizing an offer, Yeo said.
"This works for both sides and will help improve retention in the longer term," she added.
Overall, the market for tech jobs in Singapore will continue to provide good hiring opportunities, she said. To attract the talent they need, prospective employers must be more creative, she advised.
According to Balaji, it is not easy to identify "individual brilliance" with conventional methods such as interviews.
"Good candidates need to be attracted through smarter approaches," he said. "Conventional methods of looking at electronic databases...sending mass mail or [being approached] directly, are not going to work."
The preferred way to attract and retain the best people is to give them a meaningful and challenging goal, and offer attractive base pay and variable pay based on performance, Balaji suggested.
Yeo also advised IT professionals to focus on building their skills in areas of demand, instead of staying stagnant in their area of expertise.
Outsourcing: Bye bye Chennai, hello Brisbane
Posted by
Sinlung
Forget Chennai and Mumbai--the outsourcing hubs of tomorrow will be in Belfast and Brisbane.
An eclectic mix of 31 cities, in countries ranging from Australia and Ireland to areas in South America and Africa, will challenge today's best-known outsourcing centers in India and China, according to advisers at KPMG.
With telecoms infrastructures overburdened and labor markets overstretched in traditional offshore locations, companies should consider alternatives including Brisbane and Belfast, the KPMG report says.
Belfast makes the grade for its strong schools and universities, its young population, high number of IT graduates and cheap operating costs, while Brisbane has a large talent pool, a multilingual workforce and employee costs that are 10 to 15 percent lower than in other Australian cities.
According to Shamus Rae, advisory partner at KPMG in the United Kingdom, the credit crunch will drive more companies to outsource IT and business processes, which will in turn hasten the search for new cities to host these services.
The report found that the new cities in Asia-Pacific offer lower costs, younger populations and government incentives such as easy work permits, while those in Europe, Middle East and Africa promise robust telecoms and power infrastructure and niche specialisms in fields such as data management.
Meanwhile, cities in the Americas can draw on large labor pools, a more mature service offering, proximity to a major client base and multiple language skills.
Size is not a deciding factor among these emerging cities: tiny Port Louis in Mauritius, with a population of just 130,000, makes the KPMG list alongside the metropolis of Buenos Aires, home to almost 13 million people.
More important factors are the proportion of computer graduates, the number of research and development institutions, the rate of migration to the cities and common languages with their target markets.
Nick Heath of Silicon.com reported from London.
India's 3G, WiMax rollout may be pushed to 2010
Posted by
Sinlung
INDIA--Although Telecommunication Minister A. Raja this week said it is possible to hold auctions for India's 3G spectrum by Mar. 31, experts doubt the current government will proceed with haste in this realm.
Nupur Singh Andley, senior research analyst for connectivity at Springboard Research, told ZDNet Asia in a phone interview: "It is doubtful that the private players will roll out 3G services within the next eight months. In fact, we may see the roll out of 3G and WiMax only in 2010."
Naveen Mishra, communications research analyst at IDC India, concurred: "We expect the launch of 3G and WiMax services by private operators to get pushed to at least the last quarter of 2009."
While the government has allotted 3G airwaves to state-owned telecom operators--Bharat Sanchar Nigam Ltd (BSNL) and Mahanagar Telephone Nigam Ltd (MTNL)--the launch of 3G and WiMax services by private players has been delayed.
In January, the Cabinet referred the proposal to auction 3G and BWA (broadband wireless access) spectrum to a group of ministers, thereby delaying the bid indefinitely. This was prompted by differences of opinion among ministries over the base price for the auctions and number of licenses to award.
While the Department of Telecom had set the base price for the pan-India 3G spectrum at US$406.6 million (INR 20.2 billion) and for the BWA spectrum at US$203.3 million (INR 10.1 billion), the finance ministry had demanded the floor price for both be doubled. The issue became further complicated when the Planning Commission, the Department of Industrial Policy and Promotion, and the IT ministry opposed moves to double the base price.
Mishra told ZDNet Asia in an e-mail interview that although leading telecom operators already ran trials of 3G services, they would need about six months from the allotment of the 3G spectrum to offer these services. This time lag is because telecom operators have not placed orders for 3G network equipment, which takes around six months to procure. Given the uncertainties over the auction, operators do not want to take risks and will place orders only after the auction.
Delay can hurt growth
Despite the global recession, the Indian telecom market has been growing at a fast pace of 50 percent per annum. India added 113.26 million new customers in 2008, taking the total number of mobile subscribers to around 350 million by end-December. India also added over 2.3 million broadband connections in 2008, registering a growth of more than 74 percent.
The market growth is happening on the 2G platform, even as 120 countries across the world have already migrated to 3G.
D. K. Ghosh, chairman and managing director of ZTE Telecom India, told ZDNet Asia in an e-mail: "It would be naive to expect the Indian telecom industry to keep following the growth trajectory, despite other industries facing the recession. If the current market scenario continues for another year or more, we might see the negative effects on the telecom industry."
Springboard's Singh Andley said: "Delay in spectrum allocation is demoralizing the sentiment, especially for international investors."
According to Ghosh, both 3G and WiMax technologies have the potential to put a virtual PC with a broadband connection in the hands of over 350 million mobile users in the country.
Industry reports estimate that a 10-percentage-point rise in mobile usage can lead to a 0.5 percent increase in GDP (Gross Domestic Product). "Going by this estimate, the gains each year for India, which is already a trillion-dollar economy, could be as high as US$50 billion," Ghosh said, adding that the rapid adoption of 3G can solve the twin problems of low broadband and PC penetration.
"The delay [in the launch of 3G and WiMax services] has been hindering the growth of not just the telecom industry but also the country," he said.
Mishra concurred, noting that any delay in the launch will result in India lagging behind global markets where people are already consuming such services. As voice tariffs in India are among the lowest in the world, operators would like to launch value-added services to sustain and improve their current ARPU (average revenue per user) levels, he said.
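As a rough illustration of the arithmetic behind the estimate Ghosh cites above, the sketch below works through it in Python. The GDP figure and the 0.5-percent-per-10-point rule of thumb are the assumptions stated in the article; the function and variable names are invented for the example.

```python
# Back-of-the-envelope reading of the industry estimate cited above.
# Inputs are illustrative assumptions, not figures from the reports themselves.

INDIA_GDP_USD = 1_000_000_000_000      # roughly a US$1-trillion economy
GDP_GAIN_PER_10_POINTS = 0.005         # 0.5% of GDP per 10-point rise in mobile usage

def annual_gdp_gain(rise_in_points: float, gdp: float = INDIA_GDP_USD) -> float:
    """Estimated yearly GDP gain in US$ for a given rise in mobile usage (percentage points)."""
    return (rise_in_points / 10) * GDP_GAIN_PER_10_POINTS * gdp

# A single 10-point rise is worth about US$5 billion a year on these assumptions;
# the US$50 billion figure quoted by Ghosh corresponds to a far larger cumulative
# shift in usage, or to broader indirect gains beyond the simple rule of thumb.
print(f"10-point rise:  US${annual_gdp_gain(10):,.0f}")    # US$5,000,000,000
print(f"100-point rise: US${annual_gdp_gain(100):,.0f}")   # US$50,000,000,000
```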
Competing wireless technologies?
Last month, the WiMax Forum, which certifies and promotes interoperability between broadband wireless products, launched its Global Roaming Program that lets operators and vendors easily obtain information required to establish WiMax roaming services. This has raised questions on whether 3G and WiMax technologies are complementary to each other or will compete over time.
In India, companies such as Tata Communications, Intel, BSNL and Reliance Communications are proponents of WiMax, and most of them have already run beta trials of the technology.
Mishra noted that not all telecom operators in the country will win 3G spectrum licenses, and some of them "will have to bank on bagging BWA spectrum licenses, which will allow them to offer mobile WiMax services". He added: "3G and WiMax would compete with each other."
However, most experts believe WiMax and 3G technologies can coexist.
Raghu Prasad, Oracle's Asia-Pacific and Japan communications senior director of business transformation, said in an e-mail interview: "It is very likely that in a diverse market such as India, both 3G and WiMax will find their pockets of use and influence."
Singh Andley said: "While the advent of 3G will help in the growth of e-medicine, e-government, e-health, m-commerce and e-learning, WiMax will increase broadband connectivity, especially in India's rural areas."
The Telecom Regulatory Authority of India is banking on both 3G and WiMax to increase the country's broadband penetration, setting a target of 20 million broadband connections by 2010, up from the current 4.3 million connections.
Ghosh said: "The industry is expecting both WiMax and 3G to bridge this gap."
Swati Prasad is a freelance IT writer based in India.
Growth of clinical trial outsourcing raises issues
Posted by
Sinlung
By Tom Watkins
(CNN) -- The practice of moving research involving human subjects from wealthy countries to less wealthy countries has grown in recent years, raising a number of ethical and scientific issues that need to be addressed, researchers said in a journal article Wednesday.
Schulman and his co-authors reported in the New England Journal of Medicine that in November 2007, about one-third of clinical trials (157 of 509) were being carried out entirely outside the United States, many of them in developing countries. Between 1995 and 2005, the number of countries where such trials were being carried out more than doubled, while the number in the United States and Western Europe decreased, the researchers at Duke University said.
The shift appears to have been driven at least in part by economics -- a top medical center in India charges about a tenth what a second-tier U.S. medical center would charge per case report, the authors said.
Another incentive to move such work abroad: other countries' regulatory environments can be less burdensome. The authors reported one study that found only 56 percent of 670 researchers surveyed in developing countries said their work had been reviewed by a local institutional review board or a health ministry.
Another study reported that 18 percent of published trials carried out in China in 2004 adequately discussed informed consent for subjects considering participating in research.
In addition, recruitment of study subjects can be easier in developing countries, where a trial subject may get more than a year's pay to participate or participation could be his or her sole means of being able to get treatment, the authors said.
Transparency is yet another issue.
"We know little about the conduct and quality of research in countries that have relatively little clinical research experience," they wrote.
Schulman put it more bluntly.
"We've seen problems with people cheating on clinical trials," he said.
He acknowledged that similar problems have arisen in the United States, but said such misdeeds were less likely to be found out when they happened abroad.
Of critical importance is the fact that some populations' genetic makeup may affect their response to medication, the authors said. For example, they said, some 40 percent of people of East Asian origin have a genetic trait that impairs ethanol metabolism and limits response to nitroglycerin treatment.
"This finding may affect the relevance of trials involving cardiac, circulatory and neurologic disorders that are treated with nitroglycerin or nitric oxide-dependent therapies," they said.
The authors called for regulations governing trials to be reduced while ensuring ethical conduct, for greater use of centralized oversight boards and for research contracts to be written using standardized terms.
"Key strategies for clinical trials should be outlined in formal clinical-development plans, publicly vetted, and submitted to regulatory agencies," they said.
Alan Goldhammer, vice president for scientific and regulatory affairs at the Pharmaceutical Research and Manufacturers of America, said the industry will study the suggestions and weigh whether to incorporate them.
"We're constantly taking a fresh look at all our documents and revising them as appropriate," he said. "The last thing any company wants to have happen is for a trial site to be called into question and that data then not used for review by the regulatory agency, which could put its approval status in limbo."
The matter has gained in importance with the announcement by the Obama administration that the government will carry out tests to determine which drugs work best.
A spokeswoman for the Food and Drug Administration, who said she could not be identified because she had not sought permission to talk to the news media, said the agency has begun training and educating regulators in countries where clinical trials are being carried out for companies that are seeking U.S. approval for their drugs.
Antitrust Pick Varney Saw Google as Next Microsoft
Posted by
Sinlung
By James Rowley
Christine A. Varney, nominated by President Barack Obama to be the U.S.’s next antitrust chief, has described Google Inc. as a monopolist that will dominate online computing services the way Microsoft Corp. ruled software.
“For me, Microsoft is so last century. They are not the problem,” Varney said at a June 19 panel discussion sponsored by the American Antitrust Institute. The U.S. economy will “continually see a problem -- potentially with Google” because it already “has acquired a monopoly in Internet online advertising,” she said.
While the remarks were made months before Obama picked her to head the Justice Department’s antitrust division, the comments signal her approach to the job if confirmed by the Senate. The Microsoft case, brought in 1998 by the Clinton administration, could have led to the breakup of the software giant and was a landmark in antitrust law.
In her remarks at the American Antitrust Institute, Varney advocated aggressive enforcement of antitrust laws to curb the conduct of individual companies that dominate an industry. She didn’t return a reporter’s telephone call seeking comment today.
White House spokesman Ben LaBolt said that Obama nominated Varney “to vigorously enforce the law” and “is confident that she can do so in a fact-specific and evenhanded way with every matter she will face.”
Lobbying for Netscape
Varney, 53, lobbied the Clinton administration on behalf of Netscape Communications Corp. to urge antitrust enforcers to sue Microsoft. She had previously been a member of the Federal Trade Commission under the administration of President Bill Clinton where she became an advocate of online privacy.
Her comments on Google last year combined praise for the company along with her warnings. The Mountain View, California-based company, owner of the world’s most popular search engine, is a “spectacular” innovator that became the dominant online advertiser through “terrific work,” Varney said.
She also said Google had “lawfully” obtained its monopoly.
Still, Google is “quickly gathering market power in what I would call an online computing environment in the clouds,” she said, using a software industry term for software that runs on the Internet rather than on individual personal computers.
“When all our enterprises move to computing in the clouds and there is a single firm that is offering a comprehensive solution,” Varney said, “you are going to see the same repeat of Microsoft.”
Google ‘Discriminating’
As in the Microsoft case, “there will be companies that will begin to allege that Google is discriminating” against them by “not allowing their products to interoperate with Google’s products,” Varney said.
Google spokesman Adam Kovacevich said in an e-mail response the company has stiff competition that “is literally one click away” on the Internet. Nothing prevents unhappy customers “from switching to another search engine,” he said. “Cloud computing is really in its infancy,” he said. “There’s going to be rich competition in that space for a long time to come.”
A survey by ClickStream Technologies last year found that Google Docs, a Web-based application, is used by only 1 percent of Internet users.
Research Report
A May 2008 research report by Merrill Lynch & Co. said Google and Salesforce.com are “leveraged to benefit from cloud computing because their core service delivery is based on the Internet.” They were among 10 companies, including Amazon.com, Sun Microsystems Inc. and Microsoft, that were positioned to capitalize on the shift to Internet-based software. Still, Google’s “revenues from these services are currently less than 1 percent of total revenues,” the report said.
Varney said last June she was “deeply troubled” by Google’s acquisition of DoubleClick Inc., a maker of software for online advertising, and its proposed advertising alliance with Yahoo! Inc.
The DoubleClick purchase was cleared by the Federal Trade Commission in 2007. In November, the Justice Department blocked Google’s proposal to share online advertising with Yahoo by threatening to file a lawsuit challenging the joint venture. Google and Yahoo canceled the deal.
In its statement about the proposed alliance, the Justice Department described Google as “by far the largest provider” of advertising based on Internet searches and syndication of such search ads.
Market Share
The government said Google had a market share of more than 70 percent of both markets. Together, Yahoo and Google would have controlled 90 percent of the Internet search-ad market and 95 percent of the market for search syndication deals.
Varney had invited outside groups like the American Antitrust Institute to help the next administration find ways to enforce the anti-monopolization provision of the Sherman Antitrust Act, known as Section 2, “in a meaningful way in the coming decade given the way the economy is going.”
“Telling a liberal Democrat to go out and enforce Section 2 is a little bit like telling a Catholic ‘do not sin.’ Yeah, we want to do that,” she said.
Because the Bush administration Justice Department hadn’t brought any anti-monopolization cases, it has ceded the field to European authorities, she said. The European Commission has continued to bring complaints about Microsoft’s business practices and investigated Intel Corp. before the FTC began its own inquiry last year.
Global Competition
U.S. companies would be hurt “if we don’t have influence on the development” of global competition policy toward “dominant-firm behavior” because “the Europeans are even much more extreme than I would be,” she said.
Varney described her role representing Netscape as helping “create the political climate” for the government to sue Microsoft.
“When we went after Microsoft,” the company wasn’t “viewed in any way as a drag on innovation” and it was “very, very difficult to get consensus to get the government” to bring a case, she said.
In the Microsoft case, the Justice Department accused the company of illegally defending its operating software market by thwarting distribution of Netscape’s rival Navigator Web browser.
An appeals court upheld findings that Microsoft had abused its monopoly and set aside a judge’s order to break up the company.
The Bush administration negotiated a settlement in 2001 that required Microsoft to give computer makers freedom to promote products that competed with Microsoft’s software offerings, such as Internet Explorer and Windows Media Player.
Google fell $15.02, or 4.2 percent, to close at $342.66 in Nasdaq Stock Market trading.
Winter Olympics tech on track
Posted by
Sinlung
A year before the Winter Olympics in Vancouver get under way, the technological backbone is nearing the finish line.
The 2010 Olympic Games will depend on 13 computer systems handling everything from streaming results worldwide to workforce management, as well as protecting the Games from security breaches and technical faults.
Atos Origin, the company overseeing the installation of the Games' IT, says that systems are in place and are being put through 100,000 hours of testing.
"Like the athletes, this complex network of systems must work perfectly from the first moment, in front of three billion people. There is no second chance," Magnus Alvarsson, Atos Origin's chief integrator for the 2010 Games, said in a statement.
Atos Origin has overseen IT at the Olympic Games since 2002 and is responsible for overseeing the technology behind the London Games in 2012.
The London Games' chief integrator recently told ZDNet Asia's sister site Silicon.com that the IT backbone for the 2012 Games would be trimmed in a drive for "sustainability."
Yahoo MyWeb bites the dust
Posted by
Sinlung
Yahoo said last week that it will discontinue its MyWeb service on March 18 and is encouraging people to use other Yahoo bookmarking services.
"As we have continued to innovate with the 2.0 release of Delicious and the upgraded Yahoo Bookmarks, we saw that MyWeb users' needs are being served by our newer products," the company said in a posting on its Yahoo Search Blog.
The company has been de-emphasizing the service since at least October 2006 when it started sending MyWeb users to Delicious and offering them the option of using Yahoo Bookmarks.
Yahoo launched MyWeb in 2005, the same year it bought Delicious.
Audit: No customer data exposed in Kaspersky breach
Posted by
Sinlung
An independent audit of a data breach at security firm Kaspersky's U.S. Web site has confirmed that no customer data was exposed, Kaspersky said last week.
A Romanian hacker site used a SQL injection and cross-site scripting attack to gain access to a database on a Web site of Moscow-based Kaspersky, and publicized the attack last week.
Kaspersky announced early last week that it would hire database security expert David Litchfield to analyze the breach.
In the report, Litchfield concludes that an attacker based in Romania used Google to search for Web servers owned by Kaspersky running applications that may be vulnerable to a SQL injection attack, launched an attack, and attempted to gain access to customer data, but failed.
"This caused a number of other attackers from various locations to probe the site further," the report said. "None of these follow-up attackers accessed any customer data either."
The report was delivered to Kaspersky last week.
The same HackersBlog site also launched subsequent SQL injection attacks on the Web sites of two other security firms, BitDefender and F-Secure.
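For readers unfamiliar with the technique, SQL injection works by smuggling SQL syntax through input that an application pastes straight into a query. The snippet below is a generic, minimal sketch of the flaw and of the standard fix (parameterized queries); the table, column and values are invented for the illustration and have nothing to do with Kaspersky's actual site or schema.

```python
# Minimal, generic illustration of SQL injection and its standard mitigation.
# The schema and data are made up for the example; this is not Kaspersky's code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: user input is concatenated straight into the SQL string,
# so the OR '1'='1' clause makes the query match every row.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()
print(vulnerable)  # leaks all rows

# Safe: a parameterized query treats the input purely as data.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # returns nothing; the injection attempt is inert
```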
Facebook hits 175 million user mark
Posted by
Sinlung
By Steven Musil
A little more than a month after announcing it had 150 million active users, Facebook has reached 175 million active users--the statistic the social networking site prefers to use, rather than registered accounts overall.
Dave Morin, who runs Facebook's application platform team, announced the milestone Friday evening on his Twitter/FriendFeed. Facebook reached 150 million just more than two months after reaching 120 million and about four months after reaching 100 million.
While Facebook got its start at Harvard University in Cambridge, Mass., in 2004, most of this recent growth is coming from outside the United States.
"This includes people in every continent--even Antarctica," CEO Mark Zuckerberg wrote in a blog post last month. "If Facebook were a country, it would be the eighth most populated in the world, just ahead of Japan, Russia, and Nigeria."
However, as we have pointed out before, server power is expensive, especially overseas. Facebook has raised a ton of venture capital, is reportedly hunting for more, and says it's in good financial shape. That brings back the question, however, of whether it's growing faster than it ever expected to.
Microsoft launches Windows Mobile 6.5
Posted by
Sinlung
By David Meyer
Microsoft has revealed Windows Mobile 6.5, along with an application marketplace and a Web-based backup and synchronization service for the operating system.
The announcements were made by Microsoft chief executive Steve Ballmer on Monday at the Mobile World Congress in Barcelona. He also said that handsets using the Windows Mobile operating system would now be known as Windows phones, to make them "easier for the consumer".
"It's a mouthful to say 'Do you want a Windows Mobile phone?'," Ballmer said. In the future, all Windows phones will be identifiable by having a common button displaying the Windows logo.
Windows Mobile 6.5 is not yet available but was announced as a future update for three new handsets announced at the show--HTC's Touch Diamond 2 and Touch Pro 2, and LG's GM730--which will ship with the current version. The new operating system has a revised user interface (UI) that incorporates touch features such as swiping to change applications--although many of these new features have already been incorporated by handset makers in existing, customized versions of Windows Mobile.
The new version of the mobile operating system also has a hexagonally partitioned homescreen that presents various icons or widgets in a honeycomb-like matrix.
Also included is a new version of Internet Explorer Mobile, promising a more desktop-like experience. Called version 6, this was promised but not delivered as part of Windows Mobile 6.1, leading manufacturers such as HTC to preinstall rival browser Opera.
In a Q&A session during Microsoft's press conference, the company's senior vice president for mobile communications, Andy Lees, told ZDNet Asia's sister site ZDNet UK that Microsoft only "shipped the improved rendering engine halfway through the life of Windows Mobile 6.1".
"It was a two-phase thing--the rendering engine and now the user interface," Lees said, claiming that the late shipping meant users would see an "even higher experience".
Asked by ZDNet Asia's sister site ZDNet UK how much Opera's widespread installation on Windows phones had hurt Microsoft, Lees said the impact was "negligible".
"It's not the area where I would have aspired to see the first add-ons," Ballmer quipped, adding that he thought it demonstrated Microsoft's "open ecosystem".
The new version of Internet Explorer supports embedded multimedia content through Adobe's Flash Lite, but does not support Silverlight, Microsoft's rival technology to Flash.
"We have some more work to do on Silverlight," Ballmer conceded.
The new backup service, Microsoft MyPhone, will let Windows phone users back up and restore up to 200MB of data using a Web-based repository, adding PC-based Web management of the content.
The Windows Mobile marketplace did not feature heavily in Ballmer and Lees's presentation, although Lees did say that such a concept--already in use by Apple, Google and others with their handsets--"does not make a developer ecosystem" but "helps link some applications to particular customers".
Asian telcos step up 'green' in 2009
Posted by
Sinlung
By Vivian Yeo
Telcos in Asia, as with their counterparts in other regions, will place greater emphasis on green IT this year as they seek to cut costs and ride out the downturn, according to industry observers.
Ovum's senior analyst of telco operations, Sally Banks, noted that telcos globally will make emissions reduction a high priority this year. "As take-up of broadband and mobile services slows and the global economic crisis bites, telcos need to look at opportunities to stay ahead of the competition," she pointed out.
There are also tremendous opportunities for operators to help other businesses reduce their carbon footprint, noted Banks in a brief last month. "Estimates suggest that telecoms operators can achieve a 1- to 2 percent reduction in global carbon emissions by implementing green initiatives within their operations.
"However, the telecommunications industry is expected to enable other businesses to reduce emissions by up to five times this amount, highlighting that telecoms has a major role to play in enabling a green economy," she said, adding that improving brand perception is an additional benefit for telcos to go green.
Banks said operators worldwide have introduced green initiatives, such as the "use of renewable energy sources, to power networks and mobile base stations", as well as using natural resources from sustainable sources. They are also starting to use fresh-air cooling systems instead of air-conditioning in data centers, and to convert fleet vehicles to run on LPG rather than diesel or petrol.
Other green priorities include cutting down on energy use, tapping energy-efficient technologies, recycling materials from phones, networks and offices, and improving the battery life of mobile phones so they need charging less often.
Over in the Asia-Pacific region, Ovum has observed that mobile operators, including those in Cambodia, China and Pakistan, have implemented solar-powered infrastructure "as a way of not only lessening their impact on the environment but also reducing costs", said Banks.
However, more needs to be done, she pointed out. "Ovum expects a lot more activity by players that have yet to implement a formal green policy as it becomes a corporate requirement to be green and as legislation becomes more stringent in this space," she said in the brief.
According to Matt Walker, principal analyst at Ovum, Japanese telcos and the largest wireless operators have been most active in looking at ways to lower energy costs. Attention on energy management issues, however, differs depending on the type of operator, he pointed out in an e-mail. Wireless network operators, for example, are concerned with energy usage in base stations.
"Because of its vast size, China Mobile has [for example] focused lots of management attention at the energy efficiency of its network equipment," he explained.
S'pore telcos step up green
ZDNet Asia understands from a SingTel spokesperson that the telco will remain focused on energy efficiency and corporate sustainability in 2009.
According to the spokesperson, the company has been engaged in activities relating to the reduction of its carbon footprint and management of energy use and waste. These initiatives are a part of its corporate social responsibility (CSR) program.
In its Australian operations, SingTel introduced a number of energy-saving initiatives, including solar-powered air-conditioning at one office, he noted, adding that its subsidiary Optus achieved 50 percent carbon neutrality at corporate sites "through a combination of green power and carbon offsets". At its corporate office in Singapore, the telco has also raised the air-conditioning temperature, encouraged staff to switch off their lights and introduced an e-billing option.
SingTel in 2008 received the Green Globe Award from the New South Wales Department of Environment and Climate Change for its Macquarie Park facility, which is capable of recycling 92 percent of all waste material exiting the site, the spokesperson added.
Michael Sim, StarHub's senior corporate communications and investor relations manager, told ZDNet Asia in an e-mail interview that the telco is "firmly committed to establishing sustainable green practices this year" across its operations. More details will be made known at a later date, he added.
HP aces data center migration for Britannia
Posted by
Sinlung
Data center migration
Britannia Industries Ltd. has enjoyed a smooth, seamless migration with minimum service outages. Akhtar Pasha reports on the data center migration project
Britannia Industries Ltd, one of India’s leading food companies, recently selected Hewlett-Packard India to implement a comprehensive IT outsourcing and transformation project that will include infrastructure solutions, SAP Application Services, consulting and outsourced services. This makes Britannia the first major FMCG company to comprehensively outsource its IT operations, including the data center (DC). This strategic outsourcing partnership at India’s best-known food company will support the various growth and business transformation initiatives within Britannia and ensure its long-term competitive position.
As part of the long-term partnership, HP will implement tailor-made solutions for the company. As Britannia’s IT partner, HP helps the company align IT with business and achieve its strategic IT objectives through standardized, predictable IT processes and support across all operational locations. Vinita Bali, Managing Director, Britannia Industries, said, “We are happy to partner with Hewlett-Packard to drive our business results through IT solutions that are leading-edge and help us to serve our customers with excellence.” Given the steady growth in the FMCG industry, the IT infrastructure has to be increasingly responsive to keep pace with the company’s accelerated growth. To tackle this, HP has proposed an agile and adaptive infrastructure completely custom-built to Britannia’s specifications. These solutions draw on HP’s knowledge and vast experience in handling large, complex DC operations globally. Britannia selected HP as its partner in IT-enabled transformation after an extensive and exhaustive evaluation of solution providers.
What distinguishes Britannia as a front-runner in terms of IT adoption is its vision for Business Technology and a penchant for innovation. Shyamsunder, Vice President, Quality and IT at Britannia Industries Ltd., said, “The fact that we have made progress in leaps and bounds on the IT transformation front is because we have received the complete buy-in and support of the leadership team. Technology is perceived and accepted not only from the functional sense but as a key business enabler.”
One of the first IT initiatives undertaken at Britannia by HP involves a complex data center migration project.
The migration
This is the story of how HP helped Britannia migrate all 35 of its mission-critical servers to a hosted data center 20 km away. The migration was completed in 24 hours flat, with minimum downtime and disruption to business operations.
V.V. Padmanabham, Corporate Manager-IS, Britannia Industries Ltd., emphasizing the mission criticality of the project said, “In our line of work, even a day’s loss of sale is not recoverable retrospectively. This would translate to lowered availability of products to sell to the distributors and would have a cascading effect in terms of actual sales. Keeping the downtime to the bare minimum was then, one of the fundamental premises of any migration exercise undertaken.”
The Britannia and HP teams jointly made a concerted effort to get the planning and project management outline in place to ensure the success of the project. Good project management methodology entails that objectives are defined and crystal clear, that quality, scope, cost and time are properly balanced, that the plan adopted will meet the expectations of the various stakeholders, that the right resources are identified and, finally, that the project will be executed as per the plan.
Padmanabham recalled, “When we began the project we challenged several of our own assumptions on the IT infrastructure side. Every hour was critical during the data center migration and hence each anticipated activity during the migration was broken into sub-activities. Our single success factor had to be the amazing amount of detailing and the overall efficiency of the planning exercise.”
What this meant for the team internally was that any downtime required for the execution had to be justified to stakeholders and also had to stay within the parameters of their expectations. The only way to achieve this was to demonstrate an enormous level of detailing for each and every activity that had to be performed towards the server migration. Further, each activity was detailed with a defined owner, escalation levels and a communication matrix at each milestone.
The detailing exercise was initiated by first devising an efficient project plan. The team pulled together a detailed Work Breakdown Structure (WBS) of planned activities, resolving tasks down to durations of within 10 minutes. To gather the details that were woven into the WBS, the team first spent time conducting workshops with all the technical team members involved in managing the IT systems at Britannia. Once the planning exercise was in place, the estimated downtime was discussed with the relevant stakeholders and their buy-in was obtained. At the end of the planning phase, the team also realized that the estimated two-day project could be divided into four phases: back-up; shutdown, packing and moving; reinstallation; and finally testing and go-live.
By the time the planning exercise was complete, the team had answers to difficult questions: what if the truck carrying the servers to the new destination meets with an accident? What if it starts to rain at the time of the DC migration? Even the packer and mover staff had been briefed on being sensitive to time wastage and handled the backend with ease; dismantling servers, moving them, placing them and drawing cables was all smoothly orchestrated. In great measure, the success of the project was due to the meticulous planning that preceded execution: over 20 working days went into planning the execution.
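Such a work breakdown structure lends itself to a simple programmatic representation. The sketch below is illustrative only: the four phases mirror the ones described above, but the task names, owners and durations are invented, not taken from the actual Britannia plan. It flags any task planned at coarser than the 10-minute resolution the team worked to.

```python
# Illustrative sketch of a migration work breakdown structure (WBS).
# Phases follow the article; tasks, owners and durations are invented examples.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    owner: str
    minutes: int   # planned duration; target resolution was <= 10 minutes

WBS = {
    "Back-up":                      [Task("Verify SAP backup set", "DBA", 10),
                                     Task("Snapshot file servers", "Sysadmin", 10)],
    "Shutdown, packing and moving": [Task("Power down app servers", "Sysadmin", 5),
                                     Task("Label and crate rack 1", "Movers", 10)],
    "Reinstallation":               [Task("Rack and cable servers", "Facilities", 10),
                                     Task("Restore network config", "Network", 10)],
    "Testing and go-live":          [Task("Smoke-test SAP login", "App team", 10),
                                     Task("Release to production", "Project mgr", 5)],
}

for phase, tasks in WBS.items():
    too_coarse = [t.name for t in tasks if t.minutes > 10]
    total = sum(t.minutes for t in tasks)
    print(f"{phase}: {total} min planned", "| needs splitting:", too_coarse or "none")
```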
Exceeding expectations
Now that the planning was in place, the next step was to plan the time and allocation of resources to take the DC migration forward. During the resource allocation it was important to keep in mind, that for the DC migration exercise to be successful, it had to be executed continuously. This meant that the IT team available within Britannia had to work continuously for an estimated 48 hours. This would have been practically impossible and the challenge was surmounted by augmenting the internal team with additional resources from HP.
The HP team working with the Britannia DC Migration team underwent a detailed induction well before the execution date to understand the technical details of the Britannia IT system. Apart from technical and operational detailing, the team also facilitated soft factors such as accommodation of the team in proximity to the DC premises. To address the need of internally socializing the project, the team identified a communication manager whose only responsibility was to communicate the completion of mile stones to the senior management at Britannia and the HP team. This person was also assigned the responsibility of bringing to the team’s attention, any major deviations which could require senior management intervention.
The project, most importantly, exemplified team-work at its best. The large and diverse team consisted of IT engineers, facility and external workers including carpenters, electricians, cleaners, packers and movers. There was also the DC hosting service provider teams at the Britannia’s new DC. It took an entire month for the project team to complete all these activities leading up to the actual movement of the DC. Next was to actually execute the game plan.
Armed with a detailed WBS, the team went through the execution of the plan with clockwork precision and completed the process relatively easier than anticipated. In the end, the systems were released to production 1.30 hours prior to the allowed time schedule and deadline. There was absolutely no damage to equipment and zero hardware failures. Padmanabham explained, “There were over 46 people who finally worked on this migration to make it successful and they were from different companies, ranging from project managers to packers and movers. HP took the lead in terms of overall program management and saw the transformation through successfully, despite the challenges presented by the nature of the hybrid environment with multiple stakeholders from different backgrounds.”
Britannia officials are pleased with the pace at which the entire operation was carried out, with minimal disruption to the business.
Webware Radar: Google Checkout stalls as Bill Me Later soars
Posted by
Sinlung
Marketing agency Rosetta released a study Thursday that found Bill Me Later and PayPal are the most popular alternative payment options on the Web, capturing 26 percent and 25 percent market share, respectively. Google Checkout increased its share by just 1 percent in 2008 and commands only 11 percent of the market.
Rosetta also found that 37 percent of the top 100 major retailers on the Web offer alternative payment options like those from PayPal and Bill Me Later, but just 7 percent of those retailers offer all three services.
iPhone developer Smule announced Thursday that it has secured $3.9 million in funding in a round led by Granite Ventures. According to the company, its apps have been downloaded by more than 1 million users, and that success allowed it to raise the capital. It plans to use the funding to further invest in apps for the iPhone and other mobile devices.
Mixx, a Digg-like service "for the mainstream," launched a new homepage called YourMixx on Thursday that lets users decide whether the new page serves as their start page or their individual notifications page. The company also announced that users whose polls are selected for publication on the site will be rewarded with "Karma and props." The Mixx user with the most published polls will be given a Pollster badge to add to their profile page.
Enterprise microblogging service Yammer will announce a hosted version of its software Thursday that can be installed inside a corporate firewall, TechCrunch is reporting. Yammer customers will be able to switch from the SaaS version of the software to the hosted service, and the company plans to transfer network information between the two, the report claims. Yammer plans to charge $12 per seat per year.
Twitter hit with 'Don't Click' clickjacking attack
Posted by
Sinlung
(Credit: Sunlight Labs)
Tweets began appearing that said "Don't Click" followed by a link. Naturally, people clicked. When they did so, a tweet was sent from their account with the same "Don't Click" message and link.
"We patched the "don't click" clickjacking attack 10 minutes ago. Problem should be gone," John Adams, aka Netik, an operations engineer at Twitter, tweeted around 11 a.m. PST.
The clickjacking appeared to be harmless and just propagated itself, according to a post on the Sunlight Labs blog.
The code "creates an iframe of the page, hides it, and when you click that button and you're logged into Twitter, it makes you post that message (even though you don't see it). There's not a bit of JavaScript involved. The only JavaScript on the page is their Google Analytics code," the Sunlight Labs post says.
Zumbox gives your house an e-mail in-box
Posted by
Sinlung
Zumbox is an interesting e-mail start-up built around the company's ability to create an electronic mailbox for every residential physical address in the United States.
The idea is that companies that send our paper statements--banks, utility companies, and so on--can now send those documents electronically. The benefits include lower environmental impact, security, and archivability of the messages. More importantly, service providers already know their customers' physical addresses. They can start delivering messages to users immediately, instead of trying to gather their customers' e-mail coordinates.
To sign up for the service, consumers go to Zumbox, enter their physical address, and then wait for a physical letter to arrive with their Zumbox PIN. That closes the loop between online user and home address, and is used to unlock their electronic mailbox.
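The mechanics of that loop are simple enough to sketch. The hypothetical snippet below (not Zumbox's actual code; the names and the six-digit PIN format are assumptions) generates a PIN for a street address, "mails" it as a stub, and unlocks the mailbox only when the resident types the PIN back in.

# Hypothetical sketch of the mailed-PIN loop described above. The PIN is tied to
# a street address and sent by postal mail (stubbed out); entering it online
# proves the user controls that address and unlocks the electronic mailbox.
import secrets

pending = {}       # address -> PIN awaiting confirmation
unlocked = set()   # addresses whose mailboxes have been claimed

def start_signup(address: str) -> str:
    pin = f"{secrets.randbelow(10**6):06d}"          # random six-digit PIN (assumed format)
    pending[address] = pin
    print(f"(stub) mailing a letter containing the PIN to {address}")
    return pin                                       # returned only so the demo below can use it

def confirm(address: str, pin: str) -> bool:
    if pending.get(address) == pin:
        unlocked.add(address)                        # online account now tied to the street address
        del pending[address]
        return True
    return False

pin = start_signup("123 Main St, Springfield")
# ...days later, the letter arrives and the resident enters the PIN online:
print(confirm("123 Main St, Springfield", pin))      # True: mailbox unlocked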
Billing companies don't have to wait for consumers to connect to the service before they start using it. The idea is that they just start sending their electronic print runs of bills and such to Zumbox, which then files messages in mailboxes waiting for consumers to activate their accounts.
Once customers sign into an account, they can then--for each biller sending them statements--optionally turn off the paper delivery they've been getting. Zumbox can alert users' preexisting e-mail accounts when they have new statements ready for them.
The consumer advantage over getting regular e-mail from a biller? It's a central, secure clearinghouse for bills, and it's archived at the Zumbox site. For the biller, the big advantage, as I said, is setup, since they already know their customers' physical addresses.
(Credit: Screenshot by Rafe Needleman/CNET Networks)
Zumbox President Glen Ward told me that the service is also secure, to HIPAA and other levels, allowing the safe sending of financial and personal medical information. The service is free for businesses sending account-based mail, like bills and statements, to customers.
So what's the catch? Commercial mail. Zumbox's customers can also send "special offers" (junk mail to you and me) to subscribers, and not just those whose physical addresses they know. They can blanket entire apartment buildings, or select all addresses within a radius around a given point.
The volume of spam should be kept in check by Zumbox's business model. It charges companies 5 cents per piece of junk--sorry, per special offer--delivered. Users can also opt out of receiving the messages per sender (but not overall).
However, if, like many people, you like getting catalogs in the mail, Zumbox's "offers" service could be a real boon: it lets you get a ton more direct mail without having to hassle with the overflow of paper catalogs. It's also, clearly, a very green solution to mail overload.
Zumbox is clever, and, I think, really useful. But it has a real challenge: it's a middleman business that doesn't become truly valuable for its endpoint users (senders and consumers) until there's a critical mass of both. That's a tough slog.
Fortunately for Zumbox, the costs for building out this business are reasonable. Zumbox runs on Amazon's EC2, unlike the very ambitious Earth Class Mail, which needed a giant physical facility to intercept and scan postal mail for its users.
The company has raised $4 million in private (non-venture) funding so far, and Ward says his runway is "as long as we need," even though he plans to start stumping for venture funds in the summer. He also says he has several high-profile senders lined up to start using the service. He wouldn't tell me who they are, but says he'll announce them shortly.
Twitter security: There's still a lot of work to do
Posted by
Sinlung
By Don Reisinger
Few people would characterize the popular and influential microblogging service Twitter as "secure." Hack attacks on Twitter, and Twitter users, appear to be increasing (latest: Twitter hit with "Don't Click" clickjacking attack).
There are two potential security issues currently plaguing the popular social network: the popular use of link shorteners like TinyURL that lead users to unknown destinations, and a single login system that some hope will be fixed with the arrival of OAuth.
Don't click on that link!
Whenever I see an interesting tweet followed by a TinyURL link, I click it. I'll admit it. I don't even consider the ramifications of my actions and often, I'm surprised by where I go.
(Credit: Don Reisinger/CBS Interactive)
But I don't think I'm alone. TinyURL is the most common link you'll see on Twitter, but it's also one of the easiest ways for a malicious user to expose you to issues ranging from phishing scams to malware installs.
Luckily, Twitter is aware of this issue, and according to its co-founder, Biz Stone, the company is working on ways to make linking safer on the site.
"User security is absolutely a concern and we're working to make the interface safer in that regard," Stone told ZDNet blogger Jennifer Leggio. "We are looking into other ways to display shared links, for example noting whether a link goes to a picture or a video or some other media element. While more a feature, this could help in addressing some of the risk with the URL redirection."
Ginx, a new third-party service (which ironically requires your Twitter login credentials to function; see next section), automatically expands shortened URLs before you click on them.
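For readers curious what such a link-expanding service does under the hood, here is a rough sketch. The expand_url helper and the short link in the example are placeholders, not Ginx's or Twitter's actual API; it simply follows the redirect chain with a HEAD request so the real destination can be shown before anyone visits it in a browser.

# Rough sketch of shortened-URL expansion: issue a HEAD request (no page body is
# downloaded), let urllib follow the 301/302 redirects, and report where the
# link finally lands. The helper name and example link are illustrative only.
from urllib.request import Request, urlopen

def expand_url(short_url: str) -> str:
    req = Request(short_url, method="HEAD")
    with urlopen(req, timeout=5) as resp:
        return resp.geturl()    # final URL after all redirects

print(expand_url("https://tinyurl.com/example"))   # placeholder short link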
But what about stopping the use of TinyURL, Bit.ly, and other link-shortening services altogether? So far, Twitter has not indicated that it wants to do that and, as some security experts claim, it shouldn't consider that option.
Peter Gregory, a professional security expert and blogger at the Securitas Operandi blog, said he believes TinyURL use "basically comes down to trust: do you trust the source of the link, or is the creator of the link luring you into visiting a malicious Web site that will attempt to implant malware on your computer?"
Both TinyURL and Bit.ly seem poised to answer that call.
Last year, TinyURL introduced a major improvement to the service that anyone using Twitter should use: a preview feature.
TinyURL's preview feature doesn't require registration and instead asks to place a cookie on your machine. Once you surf to the company's preview page, it asks if you want to enable a TinyURL preview. If so, you only need to click the link on the site and from that moment forward, any TinyURL link you click in Twitter or elsewhere across the Web won't immediately send you to the destination site. Instead, you will be redirected to a TinyURL preview page that allows you to examine the link and decide if you want to go to the respective page.
(Credit: TinyURL)
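The preview mechanism itself is easy to picture: a cookie tells the redirector to show the destination instead of forwarding straight to it. The sketch below is a hypothetical stand-in, not TinyURL's real code; the cookie name, paths, and URL table are all assumptions.

# Hypothetical sketch of a cookie-gated preview for a URL shortener. If the
# visitor's browser carries an opt-in cookie, the redirector shows where the
# short link points and lets the user decide; otherwise it issues the usual 302.
from http.server import BaseHTTPRequestHandler, HTTPServer

LINKS = {"/abc123": "https://example.com/landing-page"}   # short code -> destination (made up)

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        target = LINKS.get(self.path)
        if target is None:
            self.send_error(404)
            return
        if "preview=1" in self.headers.get("Cookie", ""):
            # Preview mode: display the destination instead of redirecting.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(f'This link goes to <a href="{target}">{target}</a>'.encode())
        else:
            # Normal mode: send the visitor straight to the destination.
            self.send_response(302)
            self.send_header("Location", target)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8001), Redirector).serve_forever()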
Bit.ly, another URL-shortening service, provides a Firefox plug-in that allows you to preview links. With both solutions running, the risk of being redirected to a malicious site should be cut down considerably, though not eliminated--nothing in link security is a sure thing.
But that's just one security issue Twitter and its users are forced to confront each day.
Open the front door, please
Do you want to update your Twitter stream with audio through Twitsay? How about updating Twitter while the site is down with Twitddict? Want to work with Twihrl, TwitterFeed, TweetDeck, or some other Twitter client? You can, as long as you give those services your Twitter username and password.
When did it become common practice to tell a start-up service you've only known about for 10 minutes the username and password of a service you rely on? Any security expert will tell us we should never give our password out to third parties and yet, if we want to use third-party Twitter tools, we need to do just that.
Last year, a service called GroupTweet, which takes direct messages sent to a user and republishes them as tweets on that respective user's account, was at the center of a controversy when one of its users, Orli Yakuel, had all her direct messages--many of them personal--become public. She claims that it happened because GroupTweet didn't make its operation clear and she didn't realize the service would work that way.
For his part, GroupTweet founder Aaron Forgue said in a statement that he was "100 percent at fault for this fiasco because I did a poor job of explaining the steps one needs to take to use GroupTweet. I sincerely apologize."
Maybe it's true that GroupTweet didn't explain the "steps one needs to take" to use it, but I think there's a better explanation for why that happened: GroupTweet, like dozens of other third-party Twitter tools, takes the user's password and has full access to their account. And when that information is provided, users are basically giving any third party an open door to do whatever it wishes with their Twitter account.
So far, the effect has been relatively minor, and few people have been hurt by offering up their Twitter usernames and passwords to start-ups. But how much longer can we put ourselves at risk before a major leak of personal data hits the social network?
OAuth to the rescue?
Twitter's developers have plans to protect us: they want to use OAuth--an open authorization protocol--to act as the middleman between your Twitter account and a third-party application.
If OAuth is implemented on Twitter, whenever you would go to a third-party site like GroupTweet and sign up to use the service, you would tell that third-party tool what your Twitter username is. That tool would then contact Twitter and ask for permission to perform its function on your account. Twitter would then ask you to verify that you wanted the third-party to perform an operation and would request that you input your password to prove it. Once complete, the third party could perform its service and you could have peace of mind knowing that you only doled out your password to Twitter itself.
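In code, that three-step handshake looks roughly like the sketch below, written against the third-party requests-oauthlib library. The consumer key, secret, and the OAuth 1.0a-era endpoint URLs are shown as assumptions for illustration; the point is that the user's password is only ever typed into the service's own authorization page, never handed to the app.

# Conceptual sketch of three-legged OAuth, using the requests-oauthlib package
# (pip install requests-oauthlib). Keys and endpoint URLs are placeholders from
# the OAuth 1.0a era; treat them as assumptions, not working credentials.
from requests_oauthlib import OAuth1Session

CONSUMER_KEY, CONSUMER_SECRET = "app-key", "app-secret"   # issued to the third-party app

oauth = OAuth1Session(CONSUMER_KEY, client_secret=CONSUMER_SECRET)

# Leg 1: the app asks the service for a temporary request token.
oauth.fetch_request_token("https://api.twitter.com/oauth/request_token")

# Leg 2: the user is sent to the service itself to log in and approve the app;
# the password is typed only into the service's own site.
print("Approve the app at:", oauth.authorization_url("https://api.twitter.com/oauth/authorize"))
verifier = input("Verifier / PIN shown after approval: ")

# Leg 3: the approved request token is exchanged for a long-lived access token,
# which the app then uses instead of the user's password.
tokens = oauth.fetch_access_token("https://api.twitter.com/oauth/access_token",
                                  verifier=verifier)
print("Access token granted:", tokens["oauth_token"])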
You'll also be able to have some control of what exactly the third-party app can access from your Twitter account, and you'll be able to disable individual apps' access to your account as you wish.
According to Twitter's senior software engineer, Britt Selvitelle, who engaged in a Google Groups conversation for Firefox developers, Twitter "will be using OAuth as (its) primary form of token auth(orization)" because the system works well.
Alex Payne, Twitter's API lead, told ReadWriteWeb last month that Twitter has an acute understanding of what it needs to do to secure its service and it has a road map in place that's currently on schedule.
"Our launch plan entails a month or two in private beta, a similar amount of time in public beta, and then a final release," Payne told the blog. "After the final release, we'll allow OAuth to coexist with Basic Auth for no less than six months, and hopefully not much longer. OAuth should be the sole supported authentication mechanism for the Twitter API by the end of 2009."
A post published Thursday on Inuda Innovations' Web site said Twitter's OAuth private beta had begun.
If OAuth works for Twitter, as Payne suggests it will, the third-party login problem will be eliminated and one important issue facing the company will be handled.
Is Twitter doing enough?
It's evident, based on the company's actions over the past year and with the news of OAuth entering private beta Thursday, that Twitter is focused on making security a key component in its plans going forward. But whether or not the company is doing enough, or fast enough, is up for debate.
Some might say Twitter needs to do more to help its users and ensure that as it becomes more "mainstream," it does everything it can to keep its users safe. But there are others who say Twitter users need to watch out for themselves and be just as savvy using the microblog as they are when trying to remove malware from Windows.
But it was Alex Payne, one of Twitter's most vocal security champions, who, after Twitter accounts were "hacked" earlier this year, posted a message on his blog highlighting Twitter's security flaws.
"Several months after I joined Twitter in early 2007, I suggested to the team that we do a full internal security audit," Payne said in a blog post on his personal site. "Stop all work, context switch to Bad Guy Mode, find issues, fix them. I wish I could say that we've done that audit in its entirety, but the demands of a growing product supported by a tiny team overshadowed its priority.
"Now we're in an unwelcome position that many technical organizations get into: (we are) so far into a big code base that's never seen any substantial periodic audits that the only way to really find all the issues is to bring in some outside help--something I sincerely hope we end up doing, but is not my call," he continued. "Ultimately, outside security audits are the price a company pays for not building security mindfulness and education into day-to-day development."
Twitter did not respond to requests for comment.