Holistic management: The Seven S model

The Seven S model guides managers to improve their strategic approach to the business. Get an overview of the framework and tips on how to apply it.

A common adage in the management consulting business is that efficiency and effectiveness are completely different measurements.
An organization can be extremely efficient, getting high productivity from its workforce and producing its product or service with very little waste or churn, yet be totally ineffective in meeting its objectives if, for instance, its product or service is not accepted in the marketplace.
This difference is often distilled to the statement "efficiency is about doing things right, while effectiveness is about doing the right things."
In an earlier column on Six Sigma, I noted that Six Sigma is primarily focused on improving quality in areas such as manufacturing, sales, and customer service; in other words, on doing things right. It's not a strategic methodology, so it's not equipped to guide managers to examine their overall business model or strategy.
So how do consultants or managers step up a level from process to strategy?
Understanding the Seven S framework’s basics
The Seven S approach is a framework that guides managers to improve not just their processes, but their entire strategic approach to the business. The model was originally proposed by Richard Tanner Pascale and Anthony Athos in their book The Art of Japanese Management; McKinsey and Company has since adopted the model as the basis of its strategic consulting approach.
Key to the conceptual foundation of this approach is the premise that the enterprise is only effective and competitive when certain elements are optimized. This approach is holistic in the sense that it proposes that the firm must refine all of these elements and bring them into harmony in order to achieve its highest level of effectiveness.
So what are the Seven Ss, and how do they fit together to help consultants and managers improve business performance? Here's a brief walk-through of the attributes of the Seven Ss.
#1. Strategy: The overriding goal or objective that the enterprise wishes to achieve, and the course of action it intends to take to reach that goal. From the viewpoint of IT, the key question here is often about alignment. Are the activities of the IT staff focused on achieving the strategic goals of the organization? Is there a forward-looking IT plan or roadmap that illustrates how the IT function will drive toward the long-term strategic objectives of the firm? Is the CIO involved in strategy formulation, or just an implementer?
Every IT professional has experienced situations in which a manager or executive becomes enamored of some technical solution, often sold to her by a sales representative as the "end-all fix", and IT finds itself devoting all its energies to implementing a product that is disconnected from the firm's strategic goals.
#2. Structure: The manner in which the enterprise is organized, and the relationships between the entities, such as departments, field offices, etc. Is the organization authoritarian, like the military, or decentralized or federated? How do internal processes and human resources work together to achieve the goals?
In my consulting experience, I've seen many firms that want to migrate to an e-commerce approach to sales, yet see e-commerce enablement as a project rather than as a structural problem that needs to be solved. No matter how good the e-commerce engine an organization builds, if its internal organization and structure are not modified to adapt to this new channel, it has very little chance of success.
#3. Systems: Not just information systems and infrastructure, but also the processes and the functions that enable the organization to work, such as recruiting, accounting, and procurement.
From e-commerce to data warehousing and knowledge management, and all across the array of processes and systems that companies employ to deliver their products and services, the ability to make the right technology decisions, optimize processes, and enhance productivity is a make-or-break element of success.
#4. Staff: The human resources that actually accomplish the work, and the recruiting, incentives, and compensation practices that encourage them to achieve. An organization's ability to attract and retain the best talent, and to keep it motivated and productive, is key to executing the enterprise's goals. All the strategic innovation in the world cannot compensate for an unmotivated staff or low productivity.
#5. Style: The elusive "corporate culture" is captured here. Is the enterprise customer-focused and quality-driven, or focused on maximizing profitability at any cost? Does the enterprise strive to build a cohesive team from its staff, or does it view its workforce as a series of interchangeable hands-for-hire?
#6. Skills: The unique competencies that drive competitive advantage. From the "hard" technical skills of designing products and managing projects to the "soft" skills of communication and teamwork, staff capabilities are essential elements of strategic success.
This element also addresses organizational skills: As we've recently learned in the case of General Motors, the ability of an organization to develop products or services that the marketplace values is the differentiating factor in the market battlefield.
#7. Shared Values: The core beliefs and attitudes that drive the enterprise. Values are not the mission of the company--that should be captured in the firm's strategy. Values are about behaviors, taking the form of statements like "we'll never sacrifice customer satisfaction for short-term profit" or "we always thank the customer for choosing us".
Applying the Seven S framework
Now that we've outlined the elements of the Seven S framework, the obvious question is: How can I apply this framework in my organization?
As a consultant, I'll start a performance improvement engagement by educating my client on the elements of the framework. Many organizations grow organically and don't think about their activities in this structured and methodical way. By simply exposing organizations to this sort of approach, you can start to ignite new ways of thinking about their strategic development process.
By using this framework to methodically analyze the current state of each of these elements, we can get a holistic view of the enterprise and begin to develop a gap analysis that can guide an improvement plan.
Some firms are very strong in some areas, such as staffing and skills, but lack a common set of shared values and a coherent strategy-development function. Through interviews, observation, and facilitated work sessions, you can pinpoint improvement areas and then prescribe a plan for optimizing those functions.
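As a simple illustration (not part of the framework itself), such a gap analysis can be sketched in a few lines of Python; the 1-to-5 scoring scale and the numbers below are hypothetical, standing in for what interviews, observation, and facilitated work sessions would actually surface.

# A minimal sketch of a Seven S gap analysis; scores and scale are hypothetical.
ELEMENTS = ["Strategy", "Structure", "Systems", "Staff",
            "Style", "Skills", "Shared Values"]

def gap_analysis(current, target):
    """Rank the seven elements by the gap between target and current state."""
    gaps = {element: target[element] - current[element] for element in ELEMENTS}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

# Hypothetical scores gathered during an engagement.
current_state = {"Strategy": 2, "Structure": 3, "Systems": 4, "Staff": 4,
                 "Style": 3, "Skills": 4, "Shared Values": 2}
target_state = {element: 5 for element in ELEMENTS}

for element, gap in gap_analysis(current_state, target_state):
    print(f"{element:13}  gap: {gap}")

In this made-up example, Strategy and Shared Values surface as the widest gaps, which is exactly the kind of prioritized picture an improvement plan can be built around.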
Seven S is just a conceptual framework; therefore, it doesn't tell us how to fix those areas that require development. By applying your experience, reviewing the ideas found in the literature (such as Good to Great and other business classics), enlisting the insights and suggestions of members of the organization, and applying disciplines like Six Sigma where appropriate, you can help firms apply a consistent approach to strategy development and execution and improve their results and competitive position.
READ MORE - Holistic management: The Seven S model

Will Intel and USB make fiber optics mainstream?

You've probably heard about fiber optics for years--some kind of exotic technology used to carry gargantuan quantities of data across continents. But in the not-too-distant future, you might be plugging these tiny glass strands straight into your computer.
That's if Intel gets its way. At its Intel Developer Forum last week, the chipmaker demonstrated a fiber-optic technology called Light Peak for connecting many devices to PCs over fiber optic lines. Intel secured a major Light Peak endorsement from Sony, and now it has begun trying to make it into an industry standard.
But bringing optical technology to the masses will require more than Intel Chief Technology Officer Justin Rattner taking the stage to plug a thin white Light Peak cable into the back of a prototype PC. According to sources familiar with the situation, the most likely mechanism to carry Light Peak out of the R&D lab to the edge of your laptop will be the venerable Universal Serial Bus, and Intel has begun pounding the pavement to try to make that happen.
"Now all the pieces are in place," Rattner said. "We need to get a standard established to turn on the entire ecosystem to Light Peak."
Even technophobes are familiar with USB. The plug-and-play technology started its journey in PCs and has spread to handsets, consumer electronics devices, digital cameras, and more. And new developments from the group behind the standard, the USB Implementers Forum, could expand adoption further, with a new, faster, more power-efficient version and with technology that makes it better for charging devices plugged into a computer or power outlet.
The new "SuperSpeed" USB 3.0 has a 5-gigabit-per-second data transfer rate, more than 10 times that of the USB 2.0 version that prevails today, and the first USB 3.0 device achieved certification last week. A separate new USB feature increases the amount of power that USB devices can draw from 0.5 amps to 0.9 amps, while adding another 1.5 amps specifically for charging batteries, making USB useful for tasks besides just transferring data.
The 5Gbps speed is a big step up; NEC's demonstration of its newly certified USB 3.0 controller showed 500MB of data transferred in 4.4 seconds with USB 3.0 compared to 39 seconds with USB 2.0. But for USB to really break out--to accommodate the data transfer needs of a large 3D TV screen, for example, or to synchronize a terabyte-capacity iPod in moments--there's still more work to be done.
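For a rough sense of what those rates mean, here is a back-of-the-envelope sketch. It assumes nominal signaling rates and decimal megabytes, and ignores protocol and encoding overhead, which is why observed transfer speeds come in well below the headline figures.

# Illustrative arithmetic only: nominal link rates, no overhead accounted for.
def transfer_seconds(megabytes, gbps):
    return (megabytes * 8 / 1000) / gbps

links = {"USB 2.0": 0.48, "USB 3.0": 5.0, "Light Peak (initial)": 10.0}

for name, gbps in links.items():
    print(f"{name:20}  500 MB: {transfer_seconds(500, gbps):7.2f} s   "
          f"1 TB: {transfer_seconds(1_000_000, gbps) / 60:6.1f} min")

# Effective throughput implied by the NEC demo figures quoted above.
for name, seconds in [("USB 3.0 demo", 4.4), ("USB 2.0 demo", 39)]:
    print(f"{name}: about {500 * 8 / seconds:.0f} Mbps effective")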
Enter fiber optics.
"At some point the industry is going to have to transition," Jeff Ravencraft, the USB-IF's president and chairman, said in an interview, because copper wires such as those in the current USB 2 and new USB 3 standards have limits on how fast they can transmit signals. "I think the next transition is going to be to optics."
Intel's aspirations and allies
Intel's hope for Light Peak is to create a single connection for video, storage devices, the network, printers, Webcams, and anything else that plugs into a PC. Light Peak uses circuitry that can juggle multiple communication protocols at the same time, and the Light Peak promise is for a universal connector to replace today's incompatible sockets for USB, FireWire, DVI, DisplayPort, and HDMI. It's a hot-plug technology, meaning that devices can be linked when they're up and running.
Intel has pre-production chips and said the technology will be ready to ship in 2010. In its current form, Light Peak can transfer data at 10Gbps in each direction along the fiber optic line, but Intel said Light Peak will reach much higher speeds--100Gbps in the next decade, according to Jason Ziller, director of Intel's optical input-output program office.
The Sony endorsement is important, because the company sells PCs, music players, cameras, video cameras, and Blu-ray players. But another company at least as significant had a quieter Light Peak appearance at the Intel show: Apple.
Intel's second demonstration of Light Peak, in which a single cable transported high-definition video and data to a storage system at the same time, used a Mac OS X computer. Apple would be a strong ally: it has influential designs with an emphasis on uncluttered appearance and ease of use, it's willing to take a stand for technology it believes is superior, and its iPhones and iPods take ever longer to synchronize with a PC as their storage capacity expands.
And on Saturday, Engadget reported that Apple isn't merely a Light Peak ally, but that it brought the Light Peak idea to Intel and has plans to bring it to Macs next year. Apple and Intel declined to comment on the matter.
But do we really need to go all the way to optical now? High-speed electrical communications is hard--wires can cause electromagnetic interference, for example, and USB 3 cables can only be 3 meters long compared to 5 meters for USB 2. But technology for transferring data over copper wires, like technology for shrinking computer chips, has defied predictions that it would run out of gas.
The short answer is that there is a need. Video screens are getting larger, expanding beyond HD TV's 1920x1080 pixels, and 3D video requires a doubled data transfer rate. Richard Doherty of the Envisioneering Group expects that even the newer DisplayPort video standard has only about 24 to 30 months before new technology needs more capacity than it can supply.
"Optical may be the only way to do it," he said, saying the need for 60Gbps transfer rates is on the horizon.
USB group to standardize Light Peak?
Ravencraft would not comment on whether the USB group is working with Intel on adopting Light Peak for the coming transition to optical communications, but there are indications that could happen.
For one thing, the USB 3.0 specification explicitly accommodates optical lines in the cable's connector, a move to try to "future-proof" the standard. For another, when Intel demonstrated Light Peak, it used USB connectors on its prototypes. Ziller said in an interview that nothing should be read into that choice, but it was conspicuous nonetheless.
Light Peak discussions are under way at the USB group, said Steve Roux, senior director of business development at NEC and a member of the USB Implementers Forum board.
"Through the USB-IF we're looking at it. It's clearly something we'll have to pay attention to," Roux said, adding, "We don't see it as a USB 3.0 killer."
The politics of standardization are another reason the USB-IF makes sense for Light Peak. Along with leading developers such as Intel, Hewlett-Packard, NEC, Texas Instruments, ST-Ericsson, and Microsoft, there are more than 200 companies involved in USB development. The USB-IF devotes two whole pages crammed with corporate logos in its presentation to illustrate how widespread USB buy-in is. And to meet Intel's ambition, Light Peak will need to win over the video community as well as those who presently use USB.
There are other groups where standards are set--the Institute of Electrical and Electronics Engineers, or IEEE, for example, which oversees the USB competitor FireWire as well as 802.11 for wireless networks and 802.3 for Ethernet networks.
Doherty believes both are possible. The USB group could get a connector defined rapidly for consumer use, and the IEEE could work on a variation for higher-end systems such as servers, with optical lines linking processors together and linking computers to storage systems.
"If this starts out as a 100-gigabit USB 3F (F for fiber) connector, there's nothing precluding it from going to IEEE and becoming a 10-terabit link with the same connector," Doherty said.
The money question
There's another obstacle besides politics that Intel and any allies must reckon with: cost.
Optical networking, in which lasers send information as photons down transparent fibers, doesn't come cheap.
Doherty believes high-volume production could lower its costs, though. And USB is nothing if not high-volume: about 3 billion USB devices ship a year right now, according to In-Stat.
"There's every indication that if they get down to USB economics, they can get the cost of the connector down to tens of cents instead of the tens of dollars of most high-performance fiber optic connectors now," Doherty said.
One way to cut costs would be to use plastic fibers rather than higher-quality glass, Doherty said. That limits data-transfer capacity compared to glass, but plastic is cheaper and also is more flexible, Doherty said. Here's a sign Intel agrees: Ziller said of Light Peak, "You can tie a knot in it and it'll still work."
Intel has conducted plenty of research into silicon photonics, in which lasers are built into processors themselves, but Light Peak uses more conventional technology for the optical modules that convert ones and zeros into light and vice versa. Ziller said Intel is using optical modules from mainstream manufacturers such as Avago Technologies, SAE Magnetics, and Foxconn.
Linking two wires is well understood, but how exactly does that work with two fiber optic lines? Rattner's demonstration featured a hot-plugged Light Peak cable, so evidently Intel has an idea how to make it work economically.
High-end fiber connections are made by fusing the optical lines, but Doherty believes a gel-like adhesive, perhaps protected by a sheath that snaps back when the connector is plugged in, could be used. "It may not be for things you take on and off a hundred times a day," he said, but such a connector could be used dozens or hundreds of times.
Plenty of Intel ideas have flopped, but the company does have more experience than most at introducing complicated technology. And it's not putting on the hard sell for Light Peak.
"We're talking hundreds of millions of ports over the next few years, which really will help drive the costs down and make it an attractive technology," Rattner said. "Fundamentally, we believe the time has come for the optical technologies to go high volume."
READ MORE - Will Intel and USB make fiber optics mainstream?

Could Microsoft fix Windows Mobile by buying Palm?

Consolidation is coming to the smartphone market. It’s simply a matter of when and how.
There are six big platforms vying for mainstream acceptance, and the market is likely to start weeding that number down to three or four over the next several years as all mobile phones become smartphones and as smartphones start replacing PCs for some users.
The platforms in the strongest position are the Apple iPhone and RIM’s BlackBerry. The platforms that have some momentum but are still vulnerable are Google Android and Palm WebOS. The platforms that are most at risk and are struggling the most technologically are Nokia's Symbian and Microsoft's Windows Mobile.
The first major consolidation move could involve Palm. The company has been rumored as a buyout target for years. However, after struggling to survive while rebuilding its platform under the leadership of former Apple executive Jon Rubinstein, Palm has had a big year in 2009 with the arrival of its new WebOS and the launch of its first WebOS device, the Palm Pre.
Despite the fact that the Pre and the WebOS have been warmly received by users and the press, Palm still faces challenges. In June, the Pre was launched exclusively with Sprint, the weakest of the U.S. carriers and an acquisition target itself. While Palm aggressively marketed the Pre with its modest resources, Sprint has not been nearly as aggressive.
READ MORE - Could Microsoft fix Windows Mobile by buying Palm?

Singapore firm offers street-level maps, too

Singaporeans will benefit from Google's Street View service, says local mapping startup Gothere.sg, which intends to incorporate the tool into its own maps once Street View makes its debut in the country.
Ang Junhan, products director at Gothere.sg, told ZDNet Asia in an e-mail interview the online mapping company has plans to make the feature--or a similar one--available on its own maps, which are based on Google's APIs (application programming interfaces).
The local site has its own repository of panoramic street-level pictures that cover some 70 percent of the island and are accessible by clicking on the camera icon on the company's digital maps, wherever the feature is available.
Ang said these pictures were not taken with the intention of replicating Google's Street View service but rather, were captured during Gothere.sg's mapping process to help plot a directory of Singapore. These photos were taken to mark out amenities such as carpark entrances and other landmarks.
Gothere.sg's business development director, Toh Kian Khai, said the pictures were taken late last year but the team decided to stitch and release the images on the site after realizing "users would love the feature".
Taken over the span of about half a year, the photos added up to "tens of terabytes worth of images", which took an average of 24 hours of processing time for each day of shooting, Toh explained.
The speed of processing was limited by the number of computers the company had, he said.
Toh said no images were captured for the remaining 30 percent of Singapore not covered by the company's street-level viewing feature, primarily in the northern part of the island, and there are presently no plans to continue its street-shooting efforts.
Developing the feature has taken lower priority than other aspects of its online map, he said, noting that Gothere.sg is currently focusing its efforts on developing the site's mobile accessibility.
However, the debut of Google's Street View is unlikely to cannibalize Gothere.sg's own street-level viewing feature. The local startup will not remove its own pictures from the site when Street View is eventually launched here, Ang said, adding that it will instead meld pictures from both companies.
"When Google's Street View goes live in Singapore, users will have more choices," he said.
Google has not announced any details or timeframe on when Street View will be ready for launch here.
Privacy concerns
News of Street View vehicles making their way to Singapore's roads first broke late last year, when Google said it had sent camera-mounted vans around Singapore to take pictures.
The search company said it sought clearance from local authorities and would put the pictures through its censoring process, which typically involves blurring out faces and car license plates.
ZDNet Asia observed that such details were not censored or smudged on Gothere.sg's pictures.
When asked, the team said it had not been contacted by government agencies regarding this issue, but noted: "We made it a point to take street-level images only on public roads and were especially careful not to venture into private property."
Google's Street View service has taken some heat over privacy concerns since it debuted in 2007. Last year, a group of privacy advocates in Japan lobbied for the service to be shut down in the country, and in Pittsburgh in the United States, a couple filed suit against the search giant, alleging Google had trespassed on private property.
READ MORE - Singapore firm offers street-level maps, too

Zoho: Cloud customization cheaper than on-premise

Customization on cloud-based deployments can be cheaper than on-premise equivalents if companies choose the right components, according to Raju Vegesna, evangelist at Web productivity suite vendor Zoho.
Choosing a cloud deployment that relies on open APIs can help push down customization cost because it allows users to choose from a wider variety of third-party software development firms to implement software tweaks, Vegesna told ZDNet Asia.
Companies should further evaluate the ease of customization afforded by a service provider before signing on, he said.
"Some applications like CRM (customer relationship management) might need customization," he noted, adding that deploying apps that support open APIs would facilitate easier integration with other apps, compared with on-premise apps. "If companies want to avoid customization, they absolutely can."
According to Vegesna, 95 percent of Zoho's CRM customers use the product out-of-the-box.
Open source software company Red Hat recently warned of cloud "lock-in", which it said could drive up costs when companies want to switch service providers.
Various industry players have come up with several "standards" in a touted move to combat this lock-in. The APS (application packaging standard) model, Red Hat's Deltacloud project and the Open Cloud Standards Incubator (OCSI) group are some examples of efforts undertaken to facilitate interoperability in the cloud.
User licenses the main cost
Vegesna said one of the bigger expenses in cloud deployments is typically user licenses, since the burden of infrastructure costs is taken out of the equation.
But high user license fees can still turn some companies off. Singapore-based Digital Scanning told ZDNet Asia in an interview that it discontinued its Salesforce.com CRM deployment because of the cost of user licenses and the training time required.
It switched to smaller player Zoho, deploying the vendor's SaaS CRM product for its 30 staff members and slashing its license fees by half, said a spokesperson. Extra features on Zoho, such as shared calendaring and an e-mail marketing tool, were additional cost-savers, she noted, adding that the company plans to stay on the cloud even as staff numbers grow.
Salesforce said in September it has a 55 percent share of the global hosted CRM market.
A Datacraft Asia executive said in May this year that the company busted its initial budget allocated to fund its Salesforce.com deployment, because it had under-provisioned the amount needed to acquire user licenses. "Salesforce.com licenses aren't necessarily cheap," he said, advising fellow users to budget sufficiently for their cloud projects.
According to earlier reports, large companies that have adopted cloud computing may face escalating costs, for example, from growing bandwidth charges. A McKinsey report released earlier this year also recommended companies avoid SaaS, saying it was a more costly option for large enterprises, compared to maintaining their own data centers.
READ MORE - Zoho: Cloud customization cheaper than on-premise

Japanese researchers downplay super CPU effort

A group of Japanese researchers are collaborating on a software standard for multicore processors to be used in a range of technology products, including mobile phones and in-vehicle navigation systems. The effort could lead to the development of a super CPU, according to the researchers.


First reported in Japanese publication Nikkei Business News early this month, the initiative involves local chipmakers and IT companies including Canon, Fujitsu, Hitachi, NEC, Panasonic and Renesas, a joint venture between Hitachi and Mitsubishi. It is supported by the Japanese government, which will be providing an initial capital of between 3 billion and 4 billion yen (US$33.3 million to US$44.4 million).
Hironori Kasahara, professor at Waseda University's department of computer science and lead designer for the project, told ZDNet Asia in an e-mail interview that the project has been approved, by the New Energy and Industrial Technology Development Organization (Nedo), to begin a one-year basic planning phase. Nedo is a funding agency under Japan's Ministry of Economy, Trade and Industry.
The grant also includes a three-year detailed research proposal, which involves research in multicore processor architectures, the development of a "parallelizing compiler with power reduction capabilities" and work on the API (application programming interface), Kasahara said.
"We are developing software de facto standard or API for multicore processors for consumer electronics and real-time embedded systems, [such as those used in] automobiles," he explained. "Our targets are for consumer electronics and real-time embedded systems like cell phones, digital television sets, car navigation systems, robotics and automobiles."
Contrary to earlier reports suggesting the super CPU project would challenge Intel, Kasahara noted that the targeted application areas differ from those of the American chipmaker's processors. In addition, there are no hardware standards involved, he pointed out.
When contacted, a Japan-based NEC spokesperson confirmed the company is participating in an API standardization project "using energy-saving software invented by Waseda University's professor Kasahara".
"NEC is simply aiming to advance unified standards among software technologies," the spokesperson added. "NEC is not, however, planning to develop a new CPU, as described in recent reports, which handles these latest technologies."
Kasahara added, however, that if the follow-up three-year proposal is accepted by the authorities, he would be looking to "develop a test chip by 2012 with one or two companies" tapping the parallelizing compiler and API.
In an e-mail interview, Matthew Wilkins, principal analyst for compute platforms at iSuppli, said the initiative appears to point toward a primary focus on consumer embedded applications, with compute as a potential application at a later date. However, Wilkins said gaining entry to the mainstream server business would be "extremely difficult" if the CPU is not x86-compatible, as the de facto operating systems, Linux and Windows, run on x86 architecture.
Nick Jacobs, group PR manager at Intel Asia-Pacific, said the chipmaker does not have details to announce in its processor roadmap that pertain specifically to solar technology. "[However, Intel] has been making significant inroads into power reduction for processors over the years", he said. As of August 2009, the company has invested over US$100 million in clean-tech startups globally.
READ MORE - Japanese researchers downplay super CPU effort

Filipinos turn to Web for typhoon help

Internet users in the Philippines have turned to blogs and social networking sites to initiate relief efforts for victims of Typhoon Ketsana, or "Ondoy" as it is known to the local community.
The country was severely hit by the typhoon over the weekend, which saw torrential rains and massive flooding across most of Metro Manila and nearby areas in Luzon.
According to the government, the continuous downpour was the highest recorded in the country since 1967 and has left at least 70 dead and thousands homeless.
Aside from providing up-to-the-minute updates, Filipino online users have started using their blogs, Twitter, Facebook and other social networking sites, to galvanize relief efforts in the wake of the flooding.
Numerous blogs started pitching in to provide contact details and other pertinent information on the various assistance schemes early Sunday, when the floods started to recede.
One Filipino blogger listed other blogs that posted photos of flooded areas, traffic updates, information on relief goods, as well as online appeals for help to stranded victims. Some blogs used Google Maps to provide data on severely hit areas that should receive priority in relief efforts.
Twitter was also extensively used for the campaign, with numerous tweets exchanged to provide constant updates on relief efforts. One member initiated a personal campaign to solicit help and maintained a list of needed relief goods such as water and other basic food items.
The site also manages a list of volunteers who have signed up to help distribute relief goods, along with relevant information such as contact details, telephone numbers, addresses and locations where donors and volunteers can coordinate relief efforts. One such group is the non-government foundation WorldvisionFoundation.
READ MORE - Filipinos turn to Web for typhoon help

M'sia probe over Web content raises global ire

International media watchdogs and human rights groups are up in arms over attempts by Malaysia's ICT regulator to "harass" a local political news site into censorship.
The Malaysian Communications and Multimedia Commission (MCMC) had launched an investigation early this month after Malaysiakini carried two video clips which local authorities deemed offensive.
The Web site refused to comply with a Sep. 3 order issued by the MCMC to remove a video showing angry protesters in Selangor state, marching with a severed cow's head to oppose the building of a Hindu temple, and another in which the home minister described the protesters' actions as legal.
The MCMC in a letter stated that the display of both videos "is an offence under Section 211/233 of the Communications and Multimedia Act", which prohibits content that is "indecent, obscene, false, menacing or offensive in character with intent to annoy, abuse, threaten or harass another person". Offenders face a fine of up to 50,000 ringgit (US$14,542) or up to a year in prison.
When contacted, a senior MCMC official confirmed its probe on Malaysiakini is almost complete. "The investigation papers will be submitted soon to the Attorney-General for further consideration," said the spokesperson, who declined to respond to statements made by global and regional media and human rights groups condemning the government's move.
Malaysiakini's co-founder and editor-in-chief, Steven Gan, questioned the MCMC's use of the CMA and cited Section 3 of the document, which stated: "Nothing in this Act shall be construed as permitting the censorship of the Internet."
"So to use the CMA against us would be a clear abuse of the law as [the Act] is not meant to be used in such a manner," Gan said in a phone interview with ZDNet Asia.
However, he expects Malaysiakini to be charged--likely under the CMA--given the "speed and extent of the investigations" by the MCMC.
"[If that happens], we will definitely be defending ourselves," he said. Gan maintains that the two video clips were not offensive, noting that Malaysiakini's intention was to "get the news out" and not to stir religious hatred.
Regional, global lobbyists urge end to probe
New York-based Human Rights Watch last week said the Malaysian government should drop its order for the news portal to remove the videos.
"The government's investigation of Malaysiakini is nothing short of media harassment and it needs to stop," Elaine Pearson, deputy Asia director at Human Rights Watch, said in a statement on the global human rights organization's Web site. "Malaysians are entitled to know all sides of a story. It is not up to the government to approve what news is fit to air, print, or post. The government wants to make the problem disappear by taking the videos off the Internet, but Malaysians have a right to see for themselves what happened and hear what was said--the government shouldn't be suppressing this information."
Reporters Without Borders (RWB) also pledged its support for Malaysiakini, noting that the site is "right to resist the censorship" imposed by the Malaysian government.
"The authorities should understand it is footage that shows something that happened, which may indeed be embarrassing for some authorities but does not constitute an offence," the organization said in a statement posted on its site Sep. 24. "We urge the Commission to set aside its Sep. 3 ruling and in general, we call on the authorities to stop the censorship and intimidation that pushes journalists into self-censorship."
Malaysia is ranked 132 out of 173 countries on RWB's 2008 world press freedom index.
Several media associations across the region have also hit out at the MCMC's probe on Malaysiakini.
Bangkok-based Southeast Asian Press Alliance (SEAPA), Pacific Islands News Association (PINA) and the Philippines-based Center for Media Freedom and Responsibility (CMFR), have written to the MCMC to voice their concerns.
"SEAPA has seen the videos, Malaysiakini's coverage and the circumstances surrounding the same, and we have come to the conclusion that Malaysiakini has done nothing more than cover two legitimate news events," the group's executive director Roby Alampay, said in a letter to MCMC COO Mohd Sharil Tarmizi. "SEAPA calls on the MCMC to cease its questioning of Malaysiakini editors and any further pressures that would violate Malaysiakini's rights and prerogatives to cover events and the news as they see fit."
CMFR's executive director, Melinda De Jesus, called on the MCMC to stop the "harassment" of Malaysiakini and its editors so the site may continue to fulfill its mandate, as a news organization, to deliver news to a public that needs information on events that affect it.
In a letter to the MCMC's Sharil, PINA urged the regulator to "stop its intimidating tactics and harassment" of Malaysiakini's editorial team.
The Malaysian government last month said it had no plans to censor online content, following reports that it was looking at proposals to impose an Internet filter to block "undesirable" content.
Local authorities last year detained political blogger Raja Petra Kamaruddin under sedition charges. Kamaruddin, who is the editor of political portal Malaysia Today, was eventually released in November.
In a report last month, local politicians and industry players said the government's inconsistencies with its treatment of online censorship could have adverse effects on foreign confidence and investments.
READ MORE - M'sia probe over Web content raises global ire

Skype founders pull out the stops

Skype's founders are not letting go of their brainchild without a legal fight. Turns out they mounted a formidable financial battle, too.
At the end of August, Skype founders Niklas Zennström and Janus Friis submitted an 11th-hour bid for the Internet-calling business, BusinessWeek.com has learned. The pair, who had sold Skype to eBay in 2005, offered to buy it back for US$2.1 billion, according to three people familiar with the situation. The offer was made in conjunction with private equity firm Elevation Partners, one of the people said.
Friis and Zennström were trumped, of course, by a rival bid that valued Skype at US$2.75 billion, but their efforts show how far the pair are willing to go to retain control of what makes Skype valuable.
Ratcheting up pressure
Having lost the bidding tussle to a group of investors led by Silver Lake, Skype's founders redoubled their fight in courts. The pair allege that Skype infringes on copyrights owned by their company, Joltid, and filed lawsuits in hopes of forcing Skype to stop using the technology. The lawsuits are not likely to scupper the sale, but they ratchet up pressure on Skype to reach a settlement or find another way of delivering Internet calling, legal experts say.
On September 18, Zennström and Friis took aim at Mike Volpi, a partner at Index Ventures, also part of the group that bought 65 percent of Skype for US$1.9 billion and US$125 million in debt. The lawsuit alleges breach of fiduciary duty, saying Volpi was aided in the successful buyout of Skype by information he obtained while he was CEO of Joost, a company also founded by Zennström and Friis.
Two days before that lawsuit was filed, Joltid filed a lawsuit against the Silver Lake-led group of investors and eBay, alleging copyright infringement.
Growth remains notable
Volpi and representatives for Skype, Silver Lake, and Index declined to comment, as did representatives of Joost and attorneys for companies run by Zennström and Friis. Alan Marks, a spokesman for eBay, said the company is "focused on closing the deal" and expects it to be finalized by year's end.
The Skype stakes are high. In the second quarter, Skype's user base jumped 42 percent from a year earlier, to 480.5 million people, and its sales rose 25 percent, to US$170 million. While growth has slowed from previous quarters, eBay still expects the company to achieve US$1 billion in annual sales by 2011. The new owners aim to accelerate growth by putting Skype on a broader array of mobile devices and pushing it deeper into businesses. "The potential of this company is underestimated," Mark Wiseman, senior vice-president of private investments for the CPP Investment Board, which also invested with Silver Lake, told BusinessWeek.com soon after the purchase was announced.
The founders may not be appeased by a financial settlement and instead may be holding out for a stake in the company, if not a leadership role, said Nitzan Shaer, a managing director at venture fund High Star Group who previously managed Skype's mobile efforts. "They'd like to lead the acquisition themselves," he said. "Now, it's a battle for who controls and leads [Skype]." Joltid and Joost are being represented by Skadden, Arps, Slate, Meagher & Flom.
Technical obstacles for new owners
Though the legal contest will be hard fought, it's not likely to derail the deal. If Silver Lake and its fellow investors walk, they'd have to pay eBay US$300 million. For its part, eBay has agreed to take on 50 percent of any potential damages from litigation. And Skype's founders had alleged copyright infringement before the deal was clinched, so it's not like the bidders weren't aware of the potential legal hurdles.
But Skype's new owners also face technological obstacles. Since the 2005 acquisition, eBay licensed Skype's core technology from Joltid, which terminated the license in March. Since then, eBay has been scrambling to devise a workaround that would let Skype quit using Joltid's technology altogether. A viable workaround would be an important lever in negotiations with Joltid, which said it is suffering damages of US$75 million a day. "If they can do a complete workaround, then things become materially different," said Randolf Katz, partner at law firm Baker & Hostetler.
With his deep technology expertise, Volpi may be instrumental in developing different approaches to calling. "Clearly, there are other services like Skype out there," said Joyce Kim, chief marketing officer at Web-calling technology company Global IP Solutions, whose customers include networking giant Cisco Systems, where Volpi used to be chief strategy officer. "Certainly, there are other ways to do what they are doing."
Settlement is possible
Skype could even modify its service so it works similarly to Web-calling rival Vonage, experts say. It could develop the technology in-house, or even buy a Web-calling startup, such as damaka. Volpi left his CEO position at Joost to join Index Ventures in June. While at Cisco, he helped the networking giant acquire more than 70 companies. "Skype isn't out of options," said Jeffrey Lindsay, an analyst at Sanford C. Bernstein. "It's not a hopeless case by any means."
What's more, Skype will have months more to finesse the workaround. The lawsuit made public September 16 "is unlikely to get to trial until a year or two from now," said Rod Dorman, a partner at Hennigan, Bennett & Dorman, which represents Joltid. The lawsuit against Volpi and Index is tricky, and could take time as well, as proving breach of fiduciary duty and resulting damages is typically more difficult than proving copyright infringement, Katz said. The parties could still settle before then; one possibility is that Skype may be offered an option to acquire Joltid, with the founders getting a piece of the equity in Skype in exchange.
READ MORE - Skype founders pull out the stops

Asian enterprises want 'packaged clouds'

Enterprises looking to go to the cloud quickly do not want piecemeal cloud components, according to Parallels, a virtualization vendor that is betting its strategy on customer demand for "packaged clouds".
Jan-Jaap Jager, Parallels' Asia-Pacific general manager, said in an interview Thursday that enterprises in the region are interested in moving to the cloud, but want the transition to be "plug and play".
To address this, the vendor is hoping its "automation" packages will trump the competition. It is tying up with resellers using Parallels middleware to enable them to bundle "complete" clouds, he said.
On competitor VMware's platform-as-a-service (PaaS) efforts, such as vSphere 4, he said these pieces of technology are "enablers" for clouds and do not compete directly with Parallels' offerings.
Need for standards still a barrier
According to Jager, the existence of multiple "standards" in the cloud scene is forcing vendors to pick sides. This is affecting the confidence of users in some countries, he said.
Companies in Japan, for example, are hesitant about the cloud, compared to counterparts in other parts of Asia and the West, because the cloud is not "perfect" yet, he noted. The industry's numerous players today continue to grapple with multiple standards, preventing the industry from reaching this "perfection", he explained.
Standardization across the industry will help its players come out with better and wider services, he said, noting that Parallels sponsors the APS (application packaging standard) model, which aims to enable applications and services providers to integrate their cloud offerings.
However, the APS itself appears to be one of several cloud "standards". Red Hat's Deltacloud project, for example, was launched earlier this month to create an API that will let developers write applications for deployment across different clouds.
The Open Cloud Standards Incubator (OCSI) group was also created this year by industry bigwigs IBM, Microsoft, Citrix and Hewlett-Packard, with the same aim of facilitating interoperability in the cloud.
Jager, however, said the proliferation of various "standards" would not confuse customers because companies do not pick vendors by which standards they support. He added that standards exist to help cloud services providers offer more applications through interoperability, which in turn attracts customers that want vendors able to offer a wider selection of applications.
READ MORE - Asian enterprises want 'packaged clouds'

Asian firms still like their desktops

Despite increasing competition from notebooks and newer form factors such as netbooks, the desktop is not dead--and won't be anytime soon, say industry observers in the region.
According to IDC data, portable systems accounted for 45 percent of total PC shipments in the Asia-Pacific region, excluding Japan, in the second quarter of 2009. During the same period last year, portable PCs contributed just 35 percent of total shipments.
Reuben Tan, IDC's Asia-Pacific senior manager of personal systems research, said in a phone interview that market growth for portable systems has been driven largely by mini-notebooks or netbooks, as well as the ultra-thin form factor.
Tan noted that the introduction of more form factors within the portable space further underscores the shift toward notebooks, with vendors seeing higher notebook sales and offering more notebook models.
This growing demand is also reflected in the commercial space, where deployment of portable PCs saw a 15 percent sequential growth in the second quarter.
Despite the momentum driven by notebooks, however, there are still growth drivers pushing the desktop space, said Tan. "There is [definitely] a lot of activity happening in the commercial notebook space, but…there is still ongoing demand for commercial desktops," the analyst said.
Desktop demand is "still very strong" in terms of commercial deployment, particularly in more conservative sectors such as government, he said.
In e-mail interviews with ZDNet Asia, vendors Hewlett-Packard (HP) and Lenovo echoed IDC's sentiments.
Dennis Mark, vice president and general manager for desktop systems at HP Asia-Pacific and Japan's personal systems group, said the company is seeing "strong growth" in its desktop business across the region, particularly in emerging markets.
"Desktops remain an integral part of an enterprise or SMB's (small and midsize business) infrastructure due to their superior power, storage, processing and graphics capabilities," Mark said.
There are also specialized uses for different desktop form factors, he added. Workstations, for example, are ideal in certain vertical markets such as engineering or animation, while in general office, medical, financial, educational or call center environments, thin clients may offer the best desktop experience.
Chong Tze Sing, Lenovo's commercial desktop product manager for Southeast Asia, said: "Desktops and notebooks have different usage scenarios so they do not necessarily compete with each other. Factors like mobility and cost can largely determine if a user, corporate or consumer, decides on a desktop or notebook."
"Within the commercial sector, desktops are still in demand due to data security and cost," Chong said. Commercial desktops currently in favor are priced between US$400 and US$600, and the trend toward this price range is expected to continue, he added.
According to Chong, Lenovo sells more notebooks than desktops in much of the Southeast Asian region, except in Vietnam and the Philippines, where the purchasing power of consumers is lower.
Helping desktops live on
IDC's Tan noted that desktops have evolved to serve niche groups within the consumer and commercial markets. "PC vendors are trying to differentiate the desktops as much as possible, such that this form factor doesn't die an early death," he said.
One end of the consumer market spectrum is occupied by entry-level affordable PCs such as Atom-based nettops, targeted at consumers who are cost-conscious or purchasing a desktop as a secondary PC. On the other end of the market are high-end users who prefer to configure their own PCs, or serious gamers who demand the best performance.
Tan added that all-in-one desktops or multimedia centers that tap touch technology have also gained traction.
Similarly, within the commercial desktop space, there are also varied models that differ in feature sets, price points, chassis size and even support levels, which cater to different users and IT budgets, he added.
"Dollar for dollar, for an equivalent level of performance…in terms of bang for your buck, a desktop still makes sense in many instances," noted Tan.
READ MORE - Asian firms still like their desktops

App store riches Flash in the pan?

Increasing interest in monetizing apps will bring developers back to multi-device, "code once, run everywhere" platforms, according to Adobe. But an analyst says this is not a foolproof plan.
Ryan Stewart, Adobe platform evangelist, said in an interview with ZDNet Asia that the chase to monetize apps, especially on the Apple iPhone, has resulted in the majority of developers getting edged out by the handful of more popular apps that are raking in more revenue.
The initial "gold rush" has been tempered by developers realizing that Apple's App Store is not a get-rich-quick scheme, Stewart said.
"Developers realize they need marketing to [monetize their apps]," he said. "It's now more complex. So these developers have come back to Flash."
He said the problem of dealing with disparate handsets will grow, giving Adobe a foothold in offering developers a way to broaden their audience through apps that support multiple platforms.
Tim Renowden, devices analyst at Ovum, said in an e-mail interview: "Developers tend to go where the audience is."
"Good support" from hardware manufacturers for Flash and Flash Lite will nudge developers in Adobe's direction, Renowden said, but he noted that the proprietary Flash technology still presents a barrier for developers.
"Developer support [for Flash] has so far been limited," he said. "We may see some resistance to proprietary technologies if developers feel they can deliver comparable experiences on more standards-based technologies."
In fact, Opera's CEO remarked earlier this year that the revision of the HTML 5 open Web standard could render Flash redundant.
Nonetheless, with "many of the major handset manufacturers" having announced plans to deploy widget frameworks across their portfolio offerings, Renowden said he expects to see mobile widgets becoming more popular in the next 12 months.
Stewart acknowledged that Flash support is present on comparatively few smartphones compared with lower-end models. "We would have liked to be faster, but there's still room for someone to make [an impact in apps development]," he said. "We are concentrating more on high-end smartphones now to close the gap."
Multi- and native platform dilemma
Renowden said widget platforms are still appealing and will grow more popular due to the increasing variety of phone platforms that developers want to target.
Furthermore, the bulk of the global handset market still lies in lower-end models with limited processing power and capabilities, he said. Widgets help address this market by offering a "relatively lightweight method" of pushing Web services to this segment.
While the Ovum analyst said widgets will help developers save time catering to multiple platforms, he noted that adjustments and tweaks still need to be done for different devices. "Multi-platform widgets can simplify the delivery of cross-platform functionality, but not completely address fragmentation issues," he said.
Developers will still be forced to choose the specific platforms according to their needs, he noted. "The current fragmented state of the mobile software industry is unlikely to change in the medium term and multi-platform technologies such as widgets, are not going to be able to match the richness of user experience that developers can achieve with native applications," he explained.
Stewart said the challenges of keeping the user interface consistent across different devices remains, but Adobe hopes its impending release of Flash 10 for smartphones will help bring a richer experience to high-end devices.
Adobe announced in February plans to bring the full-fledged version of the Flash player--which runs on PCs--to smartphones running OSes such as Windows Mobile, Android and Symbian next year.
Developers will still have to build two versions of their apps for full Flash and Flash Lite, which runs on lower-end phones, but they can "reuse the same underlying skill sets", he added.
Another gap Adobe is looking to close is the ability to make better use of a device's onboard features, he said. Native apps have the benefit of tapping a phone's functions such as a GPS chip, he explained; such access is not possible with Flash right now, however.
For other developers, browsers are another avenue to pushing services onto mobiles. Yahoo, for instance, offers a "code once" platform it calls Blueprint.
Unlike the Flash platform, however, Blueprint is targeted at sites, not apps. It aims to extend the compatibility of rich media sites with devices, a strategy that varies from Flash's app-centric focus.
In order to accommodate different device capabilities, the platform "degrades the experience" to suit individual consumer devices, said a Yahoo spokesperson.
In an e-mail interview, the spokesperson said the Internet giant has grown its Blueprint team over the last two years, and is focused on extending Yahoo's products to mobiles.
However, the Internet company acknowledged the buzz around apps, saying browsers have come "second" to that, and added that developers interested in targeting specific platforms should write native apps instead.


READ MORE - App store riches Flash in the pan?

Storage goes through the looking glass

Almost overnight, storage has evolved from being rather limited and expensive to something remarkably cheap and virtually without limit.

We are at an immense turning point in IT. Almost overnight, storage has gone through the looking glass, changing from something rather limited and expensive to something remarkably cheap and virtually without limit.
We've been waiting for IT to turn this corner since 1980 or so. In 1983, a 20 MB hard disk cost as much as a used Volkswagen, and PCs generally had no more than a few hundred kilobytes of RAM. Storage was scarce. Storage space on the chip and on disk had to be husbanded, squeezed, pruned, and reallocated immediately for maximum performance.
For 25 years, software designers faced the same challenge when it came to storage: More memory meant faster performance, but space for code in memory was always extremely limited by the hardware's physical constraints.
Designers usually had to settle on a middle position between maximum performance and optimal code size. Applications whose code was too large to fit in the available memory tended to run slowly because of the massive paging that the oversized code triggered.
The first disk drives I encountered in the late '60s were the size of large washing machines. The drives weren't as noisy as washing machines, but a roomful of them (you needed a roomful to get enough storage for your IT operations) could just about drown out a conversation. IBM rented (never sold) the drives to customers, and so the size helped justify the hefty price. The disks were in a stack inside a kind of semi-transparent cylindrical cake carrier almost 20 inches in diameter.
Apple brought out the Mac in 1984, with an operating system at least 10 years ahead of Microsoft's MS-DOS, but a hard disk (which was optional) for the minuscule Mac cost as much as a new Volkswagen. Storage was priced beyond the consumer's reach, so most Macs never had a hard disk, which meant they ran at a tenth of the speed for which they were built.
Then a few years ago, drive technology turned a corner and took us through the looking glass to a place that's just the opposite, where there's almost infinite space to profit from in ways that will speed up your code. The new challenge to the entire IT industry is: Figure out how our software can be improved to take advantage of this suddenly unlimited disk space.
With unlimited disk storage and massive RAM, a totally new architecture is necessary. The basic structure of software and databases has to be reconsidered, for example. For more than 40 years, almost all database systems have used some form of data compression forced on the designer by the limited space available on hard disk and in memory. Now, suddenly, no data compression is needed for any operation on a computer, which makes that compression obsolete.
The new storage technology doesn't just bring freedom for the programmer, it also brings a new level of difficulty to the job of backup. No big change is without its price. As someone once pointed out, "Believing in progress is believing in getting something for nothing."
Twenty years ago, many IT pros had most of their hard disks backed up at any given time. Today, with these gigadrives packed with monster applications, you'll have a hard time finding anyone who has most of his hard drive space backed up in a recent copy somewhere.
The extra drive space can also be used to increase the frequency of backups; have you done that yet? Perhaps you used to keep seven generations of backup on tape. With today's gigadrives, you can keep 100 generations just as easily, or 700 for that matter.
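As a rough illustration of how cheap that kind of retention has become, here is a minimal generational-backup sketch in Python. The paths and the 100-generation limit are placeholders I've invented for the example, not a recommended setup.

# A bare-bones generational backup sketch: copy the source tree into a
# timestamped snapshot folder, then prune to the newest N generations.
# The paths and the generation count are placeholders, not recommendations.
import shutil, time
from pathlib import Path

SOURCE = Path("/data/projects")      # what to back up (placeholder)
DEST = Path("/backups/projects")     # where the generations live (placeholder)
GENERATIONS = 100                    # cheap disk makes this number almost arbitrary

def snapshot():
    DEST.mkdir(parents=True, exist_ok=True)
    target = DEST / time.strftime("%Y%m%d-%H%M%S")
    shutil.copytree(SOURCE, target)  # full copy; fine when space is no object
    # Prune the oldest snapshots beyond the retention limit.
    snapshots = sorted(p for p in DEST.iterdir() if p.is_dir())
    for old in snapshots[:-GENERATIONS]:
        shutil.rmtree(old)

if __name__ == "__main__":
    snapshot()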
The extra drive space can also be used to build historical depth into databases and data warehouses. Records can be flagged as deleted instead of being physically deleted, and historic copies can be retained whenever records are updated, giving the user the power to look at the history of the values in any field of a record.
With that architecture in place, "Show me every address where this person has ever lived" becomes a matter of a single click. Most corporate databases can't answer this kind of query today because past versions of records were never retained, thanks to perennial storage scarcity.
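To make the idea concrete, here is a toy sketch of that kind of record keeping in Python, using an in-memory structure as a stand-in for a real database table; the class, field, and record names are invented for illustration.

# A toy, in-memory illustration of the "never delete, never overwrite" idea.
# A real system would do this in the database itself; the names here are
# invented for the example.
import time

class VersionedTable:
    def __init__(self):
        self._versions = {}   # record id -> list of (timestamp, field dict)

    def upsert(self, record_id, **fields):
        """Append a new version instead of overwriting the old one."""
        self._versions.setdefault(record_id, []).append((time.time(), dict(fields)))

    def delete(self, record_id):
        """Flag the record as deleted rather than physically removing it."""
        self.upsert(record_id, _deleted=True)

    def history(self, record_id, field):
        """Every value a field has ever held, oldest first."""
        return [v[field] for _, v in self._versions.get(record_id, []) if field in v]

table = VersionedTable()
table.upsert("person-1", name="Ana", address="12 Elm St")
table.upsert("person-1", name="Ana", address="7 Oak Ave")   # a move, not an overwrite
print(table.history("person-1", "address"))                 # ['12 Elm St', '7 Oak Ave']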
This challenge of using the new wealth of space available to us is one we'll be looking at in subsequent posts.
READ MORE - Storage goes through the looking glass

How Intel's supercomputer almost used HP chips

More than a decade ago, Intel ran into an issue trying to deliver what was to be the world's top-ranked supercomputer: it looked possible that its new Pentium Pro processors at the heart of the system might not arrive in time.
As a result, the chipmaker made an unusual move by paying Hewlett-Packard US$100,000 to evaluate building the system using its PA-RISC processors in the machine, said Paul Prince, now Dell Computer's chief technology officer for enterprise products but then Intel's system architect for the supercomputer. Called ASCI Red and housed at Sandia National Laboratories, it was designed to be the first supercomputer to cross the threshold of a trillion math calculations per second.
Intel ultimately met that 1-teraflops performance deadline using the Intel chips, HP dropped its PA-RISC line in favor of Intel's Itanium processor line, and the Pentium Pro paved the way for Intel's present powerhouse status in the server market. But the supercomputing division within Intel was phased out, and ASCI Red was its last job, Prince said in an interview here on the eve of the Intel Developer Forum.
The division had enough independence that it could have used another company's chips, but eyebrows doubtless would have been raised had a rival processor design shown up in such a high-profile machine, one that ultimately used more than 9,000 processors.
It was not the only hurdle the Intel group overcame in the design and construction of ASCI Red, which used ordinary processors but plenty of one-off technology, including a customized operating system and Intel's own router chips to send data through the system.
The first version of the router chip had a data integrity problem, and Intel didn't have time to fully validate a fixed version even though the engineers knew what caused the problem, Prince said. However, in a presentation titled "Statistics for the Common Man", Prince convinced Intel management that a variety of worst-case scenario tests could reduce the validation time from more than a dozen weeks to about four to six weeks. He prevailed.
"It worked, and they didn't fire me," Prince said. ASCI Red, developed for the Energy Department's Accelerated Strategic Computing Initiative to simulate nuclear weapons physics in a computer rather than with real-world tests, led the Top500 list of supercomputers from June 1997 until November 2000, when IBM's ASCI White took the top spot.
Meanwhile, in today's world
Naturally Prince now is focused on the best directions for getting Dell servers, storage, and networking gear into customers' hands. And though he's comfortable with nitty-gritty chip details, he said customers these days are gravitating toward higher-level discussions.
"At this point nobody's keeping up with the gigahertz rating of chips," he said, no doubt, to the delight of Intel and AMD, who ran into physical limits on clock speed and focused their attention on multiple processing cores and getting more work done in each tick of a chip's clock.
Instead, he said, customers are asking, "How does this fit into my virtual environment? What's my management look like?" Thus, Dell is leading a lot of its marketing with virtualization, which lets a single physical computer house many independent operating systems called virtual machines. Dell had expected Microsoft and various Linux players to challenge virtualization expert and EMC subsidiary VMware, but VMware has withstood the competition so far, he said.
Dell itself has about 6,000 VMware-hosted virtual machines running on about 620 physical machines in its own computing infrastructure, a small fraction of the roughly 12,000 physical servers the company has in total. Some physical machines house as many as 20 virtual machines, but for business-critical tasks, Dell puts 10 virtual machines on a physical server, Prince said.
In Dell's analysis, using virtual machines saved US$60 million in capital equipment expenses, he said. But virtualization poses problems, too--the virtual equivalent of server sprawl, in which new servers are added to a company's infrastructure faster than administrators can keep up.
"You can deploy new servers in hours instead of weeks. The downside is you crank 'em out, so you have this proliferation of resources," Prince said, and virtual machines don't come with handy tracking technology. "The reason it's hard to get rid of them is it's hard to track them. There's no asset tag. There's no depreciation on a virtual server."
Hardware still matters
Though sales have moved to a higher level, hardware details still matter, Prince said. One he is most excited about is solid-state drives, which use flash memory rather than the spinning platters of conventional hard drives.
Many SSDs today directly replace hard drives, using the same size and SATA or SAS communication protocols to connect to a machine in a way that makes them interchangeable with conventional hard drives. But Prince is more interested in a technology that bypasses that older hard drive technology in favor of a more direct connection over a computer's PCI Express subsystem.
Companies including Fusion-io and Texas Memory Systems supply the technology, and Prince is among those in the server realm who like the idea. "You can get a massive performance upgrade in terms of IOPS," or input-output operations per second.
He is also a believer in a technology called wear leveling, which moves data around the physical storage device so that no cells get overused and effectively worn out. "The life has to be better than enterprise-class drives," he said.
Prince also predicted the eventual triumph of Ethernet over the more special-purpose high-speed network fabrics, Fibre Channel and InfiniBand. Fibre Channel will reach 16 gigabits per second and probably will not move beyond 40 gigabits per second, he said, while Ethernet is headed for 40 and 100 gigabits per second, with 400 gigabits and even 1 terabit per second on the horizon.
"Everybody is converging on Ethernet as the high-performance fabric of the future," Prince said.
READ MORE - How Intel's supercomputer almost used HP chips

Confirm your Project Schedule is ready for EVM

This eight-step checklist will help you determine whether your Microsoft Project Schedule is ready for earned value management.

Over the past nine years, I've been studying earned value management (EVM) and its application to IT projects. According to PMI's Project Management Body of Knowledge, EVM is an "objective method to measure project performance in terms of scope, time and cost." If you're unfamiliar with EVM, read my overview before proceeding with EVM metrics in Microsoft Project.
EVM metrics are easy to calculate, as they require only simple math. According to industry research, there are 40 critical success factors for successfully implementing EVM in an organization. One of those critical success factors is that the project management team can develop a project schedule and track progress against a project baseline. A properly developed Project Schedule is critical to calculating EVM metrics in Microsoft Project, and a poorly defined Project Schedule will result in poor EVM metrics.
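To show just how simple that math is, here is a minimal sketch of the core calculations in Python; the dollar figures are invented for illustration, and in practice the inputs come from your baselined schedule as of the status date.

# A minimal, hypothetical illustration of the basic EVM arithmetic.
# The figures are invented; in practice they come from your baselined
# Microsoft Project schedule as of the status date.

def evm_metrics(pv, ev, ac, bac):
    """Return the core earned value metrics for one status date."""
    cv = ev - ac     # cost variance (negative = over budget)
    sv = ev - pv     # schedule variance (negative = behind schedule)
    cpi = ev / ac    # cost performance index
    spi = ev / pv    # schedule performance index
    eac = bac / cpi  # estimate at completion (assumes current cost efficiency continues)
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac}

if __name__ == "__main__":
    # Example: a $100,000 project, 40% planned, 35% earned, $38,000 spent so far.
    print(evm_metrics(pv=40_000, ev=35_000, ac=38_000, bac=100_000))

Microsoft Project performs these same calculations for you once the schedule is baselined and a status date is set; the checklist that follows confirms those preconditions are in place.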
The following is a quick checklist to confirm your Microsoft Project Schedule is ready for EVM.
  1. Confirm the Work Breakdown Structure (WBS) has been correctly defined with the appropriate work packages
    Meaningful earned value metrics are only as relevant as the tasks they represent. If you've built a poorly defined WBS, earned value can't help you.
  2. Confirm the work resources have the appropriate hourly rates in the Resource Sheet view
    Microsoft Project determines the work package's budget based on each assigned work resource's hourly rate. If you don't have a defined cost for each resource, your budget will be zero.
  3. Confirm all tasks have resources assigned at the lowest level in the WBS
    A task establishes its baseline budget and planned value (PV) based on the resource costs assigned to the task. By ensuring resources are assigned to the lowest level in each work package, the cost will roll up appropriately.
  4. Confirm no resources are over allocated
    Despite our desire to give 110%, a realistic schedule has resources only allocated 100% of the time.
  5. Confirm the total costs for each work package match contractual agreements
    Microsoft Project builds a time-phased budget for each task and allocates planned work and costs according to the project calendar. If your project contracted for $100,000, the total cost in the WBS should reflect $100,000.
  6. Confirm the project has a project baseline
    The project baseline establishes the time-phased budget for the project and allows PV to be calculated. If you don't baseline the project, you can't measure what you're trying to manage.
  7. Confirm the project management staff understands the procedures to record actual start, finish, duration, and costs data for the project schedule
    As the project progresses, it is important that anyone updating the project schedule follows a consistent procedure to record start dates and actual costs. If teams use inconsistent practices, your earned value results may be off.
  8. Confirm the project status date is set in Microsoft Project before examining EVM metrics
    If you don't specify the project status date, Microsoft Project will assume the current date is the date used for calculating the earned value metrics. Since status reporting usually follows the prior week, you'll want to set the project status date each week to calculate accurate metrics.
There are 39 other critical success factors to consider before implementing EVM across an organization.
For more information about how to use earned value with Microsoft Project, read my tutorial, How to Calculate EVA in Microsoft Project.
READ MORE - Confirm your Project Schedule is ready for EVM

Broadband over power lines? Don't count on it, yet

Dominant carrier PLDT, as well as its high-profile chief executive Manny V. Pangilinan, has been in the news lately because of a senator's allegation that former President Joseph Estrada forced local tycoon Alfonso Yuchengco to sell his shares so that Pangilinan's group could take over the telco giant in the 1990s.

I won't go far in discussing this issue since the nation is already pretty familiar with it. What I'd like to touch on is PLDT's recent maneuverings at power distributor Meralco. Based on news reports, PLDT has coalesced with the Lopez family to form the bloc that now controls the power firm.

PLDT's entry into Meralco has provided a little sideshow--a boardroom battle between PLDT and beverage conglomerate San Miguel Corp. (SMC), which is now diversifying into various industries, including the energy sector.

Pangilinan's alliance with the Lopezes was crucial in the sense that it allowed Manolo Lopez to remain as Meralco CEO, while thwarting the takeover attempts of SMC at the same time.

This was also significant because SMC has bought into Liberty Telecoms, a new industry player whose main financial backer is Qatar Telecom. PLDT, of course, is the country's dominant carrier, and it would have been stupid for the company not to react as SMC slowly crept into its turf.

But observers say the bigger prize PLDT was concerned about is a new technology called broadband over power lines (BPL), which lets subscribers simply plug their computers into electrical outlets to get high-speed Internet access.

I'm speculating here, but SMC, through Liberty Telecoms, might have introduced BPL--or at least laid the groundwork for it this early--had it succeeded in taking control of Meralco. I said "might" because Liberty hasn't disclosed what kind of telco services it will offer the public. This could have been the easiest way for Liberty to make an impact in the highly competitive telco space.

I'm not saying that PLDT isn't keen on bringing BPL to the country, but it would be naive to think it wouldn't want to delay deployment--or even refuse to deploy it--until after it has recovered its investments in its other broadband technologies.

To be fair, BPL has not been extensively implemented in any part of the globe, including Europe and the United States. At a recent Cisco Systems event in Incheon, South Korea, I had the opportunity to talk to Euro Beinat, a professor from the Center for Geoinformatics at the University of Salzburg. He said BPL is an innovative technology but may take a few more years to become pervasive, since a number of technical compatibility issues between power and telephone lines have yet to be resolved.

For PLDT and SMC, it's interesting to see what role BPL will play in their continuing quest for dominance in the Philippine market.

UPS firms, NAIA needs your help
The shameful, and costly, radar failure that shut down Ninoy Aquino International Airport (NAIA) a few days ago has been blamed by officials on an old UPS facility that failed to respond when a power outage hit the airport.

According to reports, the backup UPS is already ten years old and now past its prime. Bureaucratic red tape has prevented the government from purchasing a replacement, officials were quoted as saying.

Perhaps UPS vendor APC should hurry up and appoint a country manager for its Philippine office, so it can offer NAIA a good package and spare the country another radar breakdown.

It's been some time since Pascal Bodin, the erstwhile chief of APC Philippines, left the country, but the company has yet to name his replacement.
READ MORE - Broadband over power lines? Don't count on it, yet

Being Twitter deft or Twitter dumb: is that the question?

Social networking has assumed immense importance in our lives. Over the last few months, we have seen how a tweet, a profile status or personal information on Facebook can cost some people their job, marriage, image and much more.

In fact, there is an ongoing debate on whether blogs, tweets and other information we get from Facebook are "news" or not.

Is social networking a part of the new fourth estate? In that case, everyone who tweets or blogs is a journalist. And everything pertaining to journalism--from what is news to how responsible is the reportage--will have to be redefined.

Well, I am sure this debate won't end in a hurry.

Recently, a tweet posted by Shashi Tharoor, minister of state for external affairs, took up much newsprint and news time. The government of India is on a major austerity drive and has been asking ministers and bureaucrats to travel economy class (while traveling domestically).

Tharoor responded to a journalist's query on Twitter on whether he will travel to Kerala in "cattle class" (read: economy class). In response, Tharoor said he would travel "cattle class out of solidarity with all our holy cows!"

What followed was a big hue and cry over this tweet. In India, cows are (indeed) holy and politicians are expected to toe their party's line. "How dare he refer to economy class as cattle class?" "Why is he insulting cows and economy class travelers?" "Tharoor must resign". "We should take appropriate action against him". These were just some of the reactions reported in the media.

Are we going to judge our ministers by what they say on Twitter? Is this reason enough to sack someone? How about sacking people who are corrupt and inefficient?

While this controversy was on, television channels were running from pillar to post to get reactions of various Congress party and opposition party members. There were debates on every channel. Did we really need to make such a big deal out of a 140-character microblog?

Thankfully, Prime Minister Manmohan Singh brushed off Tharoor's remark as a joke, even though a Congress chief minister demanded Tharoor's resignation and the party spokesperson said the Congress party would take "appropriate action at appropriate time". Expectedly, the opposition Bharatiya Janata Party also slammed Tharoor for the "cattle-class" tweet.

Tharoor ultimately apologized, on Twitter! "It's a silly expression but means no disrespect to economy travelers, only to airlines for herding us in like cattle. Many have misunderstood," Tharoor said, in a tweet.

He also explained that by the word "holy cows" he was not referring to any individual. "Holy cows are NOT individuals but sacrosanct issues or principles that no one dares challenge. Wish critics would look it up."

The minister said he had learnt a lesson from the episode. "I now realize I should not assume people will appreciate humor. You should not give those who would willfully distort your words an opportunity to do so." Yesterday, Sonia Gandhi, the Congress party president, asked Tharoor to tweet with tact.

While Tharoor may have learnt his lesson, I wonder what lessons we can draw from India's "Twittergate". Do we need to lay down a new set of etiquette rules for social networking Web sites? Should we keep humor and sarcasm out of our tweets? How Twitter dumb would that be? And how can we apply such etiquette to citizens of other countries (who would read, re-tweet and respond to our tweets)? Such (self-imposed) guidelines will give politicians more reason to stay away from Twitter and remain inaccessible and unanswerable to the common man.

I wonder if there are any easy answers to the various Twittergates happening all across the globe. But one thing is for sure--as more people get hooked on to social networking Web sites, we will see more controversies. And news will continue to get redefined.

In the end, the winners (in all such controversies) are social networking sites like Facebook and Twitter. And Twitter-savvy people like Tharoor, who has over 208,000 followers after the "cattle class" controversy. Never underestimate the power of those 140 characters!
READ MORE - Being Twitter deft or Twitter dumb: is that the question?

Microsoft to be heard on Word injunction appeal

Microsoft will have its day in court this week.
OK, so the software maker still spends lots of days in court, even if it has settled many of the antitrust cases that once filled its Outlook calendar. This week it will make its case to an appeals court for why it shouldn't face an injunction banning sales of versions of Word that contain a custom XML feature.
Earlier this year, a federal jury found that recent versions of Word infringe on a patent held by i4i and ordered Microsoft to pay the Canadian company US$200 million. Last month, a federal judge hiked the damage award and also ordered the injunction.
Both sides have made their arguments (and counterarguments, and counter-counterarguments) abundantly clear, so now it will be up to the federal appeals court to weigh those positions.
For its part, i4i has said it is not seeking to have Word pushed off the shelves entirely; it just wants the offending code removed.
If it loses its appeal, Microsoft could try to offer an XML feature that behaves differently, pull the custom XML feature from Word, or pursue some sort of settlement.
READ MORE - Microsoft to be heard on Word injunction appeal

APAC firms pass on remote admin via phones

Businesses based in Asia have not yet warmed up to the idea of remote management of IT systems via mobile devices, according to industry watchers.
Vikram Chandna, head of desktop practice and alliances at Wipro Technologies' infrastructure management services unit, told ZDNet Asia in an e-mail that the company has "yet to see any substantial practical application" for mobile-based systems administration. The same can be said for Wipro's customers, he added.
"One reason we reckon for the low adoption is that…the [systems administration] tools are typically installed at the offshore delivery centers or at remote monitoring and support centers, like Wipro's Global Command Center," he said, noting that the constantly-changing set of human resources at the remote delivery site tends to discourage adoption of such tools.
Chandna added that the use of mobile phones for systems administration tasks is still "niche" and should take about three to five years to become more mainstream.
Over at systems integrator NCS, all remote systems administration is performed on notebooks and desktops. In an e-mail, Bok Hai Suan, NCS' director of corporate information systems, said that industry acceptance of remote systems administration on mobile platforms appears to be "on the rise".
Using handhelds to perform system administration has its advantages, Bok pointed out. Mobile phones can provide a faster response time, which is critical during an emergency, and they allow system administrators to address issues that crop up outside working hours without having to be in the office.
On the other hand, mobile screens may prove limiting in comparison to their PC counterparts. "The screen size of mobile devices today may be too small for comfort to input command line instructions, although GUI (graphical user interface) windows may still be manageable."
In addition, mobile devices lack the "additional layer of physical security" that workstations in an office environment are subject to, said Bok. The form factor also calls for greater care in managing access control as it provides an additional avenue for "disgruntled or careless system administrators" to remotely interrupt or shut down networks.
Outside of Asia, there have been positive experiences with remote administration using smartphones. Scott Lowe, CIO at Westminster College in Missouri, the United States, and blogger for TechRepublic, ZDNet Asia's sister site, described in a recent blog post how he used his Apple iPhone to perform server administration during a particular upgrade task.
READ MORE - APAC firms pass on remote admin via phones

Huawei, Alcatel-Lucent to manage S'pore NBN

SINGAPORE--Huawei and Alcatel-Lucent have been selected to manage and provide the "active infrastructure" for the country's planned next-generation national broadband network (NBN).
The two vendors signed an agreement Tuesday with the NBN's appointed operating company (OpCo), Nucleus Connect.
The StarHub subsidiary won the bid to operate the active infrastructure of the NBN, providing wholesale broadband connectivity to downstream operators such as retail service providers (RSPs), which would then package and resell broadband services to consumers.
This "active" layer sits atop the basic fiber infrastructure, currently being laid out by a separate network company (NetCo) OpenNet, a joint venture between four companies that includes Singapore Telecommunications.
Earlier this month, Alcatel-Lucent also clinched the OSS/BSS provider role with OpenNet.
Speaking at a briefing here, Nucleus Connect CEO David Storrie said Alcatel-Lucent's OpenNet deal had no bearing on its win with the OpCo. He said the company's "open access" design of the network was a factor, adding that Alcatel-Lucent won by a "considerable" margin.
Storrie said Alcatel-Lucent initially submitted bids on both Nucleus Connect tenders, which called for OSS/BSS and networking infrastructure vendors, but was "found to be stronger in OSS/BSS". Alcatel-Lucent later linked up with Huawei to submit a joint bid, he added.
And while Nucleus Connect is the "official" NBN OpCo, it is unlikely to be the only OpCo, he noted. "I do expect at least another one," he said.
"At least a dozen RSPs are interested" in signing on to be another OpCo for the Singapore NBN, Storrie said, pointing to the number of RSPs that had signed NDAs (nondisclosure agreements) to view the Interconnection Offer (ICO) on wholesale prices, as listed by the country's ICT regulator, the Infocomm Development Authority (IDA).
Daniel Tang, CTO of Huawei's network product line, told ZDNet Asia that the Huawei-Alcatel-Lucent contract with Nucleus Connect spans seven years, during which the Chinese networking equipment vendor will provide and operate the necessary systems, as well as transfer skills to Nucleus Connect's staff.
Nucleus Connect will also set up two "super central offices" to house the telecoms exchange equipment, Tang said in a phone interview.
Storrie said the sizes of the central office facilities will be determined by the number of service providers that require co-location.
READ MORE - Huawei, Alcatel-Lucent to manage S'pore NBN

Flash cookies: What's new with online privacy

If you thought refusing HTTP cookies prevented tracking, think again. Web site developers have found a way.

Web site hosts and advertisers do not like relying on HTTP cookies, as users have now figured out how to avoid them.
According to security expert Bruce Schneier, Web site developers now have a better way. It's still considered a cookie, yet it's different.
LSO, a bigger better cookie
A Local Shared Object (LSO), or Flash cookie, is, like the HTTP cookie, a way of storing information about us and tracking our movements around the Internet.
Some other things I learned:
  • Flash cookies can hold a lot more data, up to 100 KB. A standard HTTP cookie is only 4 KB.
  • Flash cookies have no expiration date by default.
  • Flash cookies are stored in different locations, making them difficult to find.
YouTube test
LSOs are also hard to get rid of. Here is a test proving that. Go to YouTube, open a video, and change the volume. Delete all cookies and close the Web browser. Reopen the Web browser and play the same video. Notice that the volume did not return to the default setting. Thank a Flash cookie for that.
Not many people know about Flash cookies, and that is a problem. It gives people who configure their Web browsers to control cookies a false sense of security. As the YouTube test shows, those privacy controls have no effect on Flash cookies.
Where are they stored
Flash cookies use the extension .sol. Knowing that, I still wasn't able to find any on my computer.
Thanks to Google (which uses Flash cookies), I learned that the most practical way to see information about resident Flash cookies is to use the Settings Manager on the Flash Player Web site.
The following slide is from the Flash Player Web site and shows my storage settings. The visited Web sites (total of 200) shown in this tab all have deposited Flash cookies on my computer. This tab is also where the Flash cookies can be deleted, if so desired.
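If you'd rather hunt for the files locally, you can scan for the .sol extension yourself. Here is a minimal sketch that checks the commonly documented default Flash Player storage locations; treat the paths as assumptions, since they vary by operating system and Flash Player version.

# A small sketch that lists .sol (Flash cookie) files under the commonly
# documented Flash Player storage locations. The paths are assumptions and
# vary by operating system and Flash Player version.
import os
from pathlib import Path

CANDIDATE_DIRS = [
    Path(os.environ.get("APPDATA", "")) / "Macromedia" / "Flash Player",      # Windows
    Path.home() / "Library" / "Preferences" / "Macromedia" / "Flash Player",  # macOS
    Path.home() / ".macromedia" / "Flash_Player",                             # Linux
]

def find_flash_cookies():
    for base in CANDIDATE_DIRS:
        if not base.is_dir():
            continue
        for sol in base.rglob("*.sol"):
            # The parent folder name usually identifies the depositing site.
            print(f"{sol.stat().st_size:6d} bytes  {sol}")

if __name__ == "__main__":
    find_flash_cookies()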

Flash cookies are rampant
Another Google search brought me to a report by University of California, Berkeley researchers. Flash Cookies and Privacy describes what the researchers found after capturing Flash cookie data from the top 100 Web sites.
Here are the results:
  • Encountered Flash cookies on 54 of the top 100 sites.
  • These 54 sites set a total of 157 Flash shared objects files yielding a total of 281 individual Flash cookies.
  • Ninety-eight of the top 100 sites set HTTP cookies. These 98 sites set a total of 3,602 HTTP cookies.
  • Thirty-one of these sites carried a TRUSTe Privacy Seal. Of these 31, 14 were employing Flash cookies.
  • Of the top 100 Web sites only four mentioned the use of Flash as a tracking mechanism.
It appears many Web sites use both HTTP and Flash cookies, which surprised and confused the researchers. After more digging, they found the answer: respawning.
Flash cookie respawning
UC Berkeley researchers determined that HTTP cookies deleted by closing the browser session were rewritten (respawned) using information from the Flash cookie:
"We found HTTP cookie respawning on several sites. On About.com, a SpecificClick Flash cookie respawned a deleted SpecificClick HTTP cookie. Similarly, on Hulu.com, a QuantCast Flash cookie respawned a deleted QuantCast HTTP cookie."
The researchers also found Flash cookies were able to restore HTTP cookies for more than one Web site domain:
"We also found HTTP cookie respawning across domains. For instance, a third-party ClearSpring Flash cookie respawned a matching Answers.com HTTP cookie. ClearSpring also respawned HTTP cookies served directly by Aol.com and Mapquest.com."
It gets better
A while ago, I wrote a piece about how Google started using behavioral targeting (BT) after originally saying it wouldn't. In that article, I mentioned the Network Advertising Initiative (NAI), a consortium of approximately 30 companies that use BT technology. Bowing to pressure, the group created an opt-out page that makes it simple to prevent tracking.
The researchers found that setting the opt-out cookie wasn't enough. Web sites belonging to the NAI created Flash cookies anyway. The report refers to one specific incident:
"We found that persistent Flash cookies were still used when the NAI opt-out cookie for QuantCast was set. Upon deletion of cookies, the Flash cookie still allowed a respawn of the QuantCast HTML cookie. It did not respawn the opt-out cookie. Thus, user tracking is still present after individuals opt out."
Some solutions
To prevent Flash cookies from being stored, switch to the Global Storage Settings tab in the Settings Manager and clear the check box for "Allow third-party Flash content to store data on your computer", as shown in the following slide:

That is supposed to prevent Flash cookies from being installed. Ironically, we have to take the Flash Player Web site's word for it.
For their tests, the researchers used Mozilla Firefox. In the report, they mentioned BetterPrivacy, a Firefox add-on that removes all Flash cookies when the Web browser is closed. Another Firefox add-on, Ghostery, alerts you to hidden scripts that track your Web presence.
Final thoughts
I thought we were past unannounced tracking of our movements on the Internet. If the technology is so innocent, make tracking an opt-in feature.
READ MORE - Flash cookies: What's new with online privacy

US Justice Dept: Google's book settlement needs rewrite

The U.S. Department of Justice late Friday urged the court overseeing Google's book search settlement with authors and publishers to reject the settlement in its current form, although it strongly hinted that the parties are flexible on certain provisions.
"As presently drafted, the Proposed Settlement does not meet the legal standards this Court must apply," the DOJ said in a 32-page filing with the U.S. District Court for the Southern District of New York. "This Court should reject the Proposed Settlement in its current form and encourage the parties to continue negotiations to modify it so as to comply with Rule 23 (a federal law governing class-action settlements) and the copyright and antitrust laws."
After Google was sued in 2005 by several groups representing authors and publishers for digitizing out-of-print yet copyright-protected books, the parties settled out of court in October 2008. That deal granted Google sweeping rights to scan and display out-of-print books. Ever since the settlement was announced, opposition has mounted to what one University of California at Berkeley professor recently called "the largest copyright licensing deal in U.S. history." Opponents claim that Google and the plaintiffs overstepped their bounds in assigning the company the sole right to make digital copies of out-of-print books that are still protected by copyright law.
"The Proposed Settlement is one of the most far-reaching class action settlements of which the United States is aware; it should not be a surprise that the parties did not anticipate all of the difficult legal issues such an ambitious undertaking might raise," the DOJ wrote in its filing.
The DOJ has been looking into antitrust concerns stemming from the fact that Google and the nonprofit Books Rights Registry, set up to handle payments to authors, would have sole control over the pricing of institutional subscriptions to the digital library. But in its filing, it also raised questions about whether the settlement complies with Rule 23 of the Federal Rules of Civil Procedure, as well as with copyright law in general. "In the view of the United States, each category of objection is serious in isolation, and, taken together, raise cause for concern."
Still, the DOJ noted that a digital library of books holds important benefits for society, a point that has been repeatedly raised by Google's supporters, who argue that it would improve access to knowledge. It would appear that the DOJ, however, would prefer Congress settle the thorny issues of copyright laws that apply to orphan works--books whose rightholders cannot be located but which can be scanned by Google under the agreement--rather than making policy through legal settlements.
"As a threshold matter, the central difficulty that the Proposed Settlement seeks to overcome - the inaccessibility of many works due to the lack of clarity about copyright ownership and copyright status - is a matter of public, not merely private, concern. A global disposition of the rights to millions of copyrighted works is typically the kind of policy change implemented through legislation, not through a private judicial settlement," the DOJ wrote.
Reaction to the DOJ's filing allowed parties from all sides of this issue to claim victory.
"The Department of Justice's filing recognizes the value the settlement can provide by unlocking access to millions of books in the United States. We are considering the points raised by the Department and look forward to addressing them as the court proceedings continue," Google said last week in a statement.
The Open Book Alliance, whose members include the Internet Archive, perhaps Google's most vocal opponent in this matter, was likewise pleased. "Despite Google's vigorous efforts to convince them otherwise, the Department of Justice recognizes that there are significant problems with terms of the proposed settlement, which is consistent with the concerns voiced with the Court by hundreds and hundreds of other parties," the alliance said in a statement.
"This is a major agreement, and it is entirely appropriate for DOJ to look at a deal of this magnitude," said Ed Black, president and CEO of the Computer & Communications Industry Association, which has supported the settlement. "They are doing their job scrutinizing the competition aspects of this settlement."
And Consumer Watchdog, a strident opponent of the settlement, also found something to like in the DOJ's filing:

Consumer Watchdog supports digitization and digital libraries in a robust competitive market open to all organizations, both for-profit and non-profit, that offer fundamental privacy guarantees to users. But a single entity cannot be allowed to build a digital library based on a monopolistic advantage when its answer to serious questions from responsible critics boils down to: "Trust us. Our motto is 'Don't be evil.'"
Google will learn whether it has earned that trust from Judge Denny Chin on October 7 in New York. That is, unless the settlement is modified in the coming weeks, in which case we could be looking at several more weeks of debate.
READ MORE - US Justice Dept: Google's book settlement needs rewrite