CFO Europe Selects Grid as a Top Technology of 2002

IT budgets are tight, but there are some technologies that companies can’t ignore. Anthony Sibillin reveals what every savvy CFO should have on their radar screen.

December 2001/January 2002

“New technology buzzwords will be kept to a minimum next year and corporate IT investment will focus on projects that make a solid business case,” pronounces Klaus Elix, the chief technology officer of AMS Europe, a consultancy. While that might actually be music to the ears of many CFOs, the underlying message is a gloomy one for businesses that have grown accustomed to throwing lots of euros at new technology over the past few years.
Do other consultants, analysts, vendors and venture capitalists share Elix’s assessment? It seems so. CFO Europe spoke to a host of technology experts to see what they have to say about technology trends in 2002. What we heard was widespread agreement about the economy in general—each and every contributor replied to our queries with a recessionary lament coupled with a mention of September 11th.
With good reason. There is plenty of evidence to suggest that companies are rejigging their IT budgets in response to the dismal economy and the ongoing terrorist threat. Accordingly, two old buzzwords—return-on-investment (ROI) and security—will be heard more frequently next year.
That’s not to say that there isn’t anything to look forward to in the world of enterprise technology. In today’s environment, it’s clear that some technologies that might have sat on the shelf for another year will be put on a “fast-track” rollout across the enterprise, replacing formerly rising stars. After all, the basic tenets of capitalism are still in place in Europe. Firms, as always, are under pressure to lower costs, improve information flows around the organisation and protect their valuable assets. And they will use technology to take the heat off.
To help companies do that, we’ve identified ten technologies that are worth getting to know. *

Web services

Connecting two applications over the internet—the foundation of e-commerce—should be fast and cheap. Instead, says Forrester, a research firm, it can take many months to achieve and soak up as much as €1m in software and integrators’ fees.
The good news is that this could all change in 2002 when web services enter mainstream enterprise computing. Applications created as web services are built using industry standards, such as XML and SOAP. This common “vocabulary” means web services running on different computer platforms can communicate with one another without expensive “middleware”.
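The shared XML "vocabulary" idea can be sketched in a few lines. This is a deliberately minimal illustration: the service name `GetQuote` and its parameter are invented, and real web-services toolkits generate and validate these envelopes automatically rather than by hand.

```python
# A minimal sketch of why SOAP's shared XML "vocabulary" matters: any
# platform that can parse XML can recover the same request, no custom
# middleware needed. Method and parameter names here are illustrative.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(method, params):
    """Wrap a method call in a bare-bones SOAP 1.1 envelope."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, method)
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

def parse_request(xml_text):
    """Any receiver, on any platform, recovers the same call."""
    env = ET.fromstring(xml_text)
    call = env.find(f"{{{SOAP_NS}}}Body")[0]
    return call.tag, {child.tag: child.text for child in call}

message = build_request("GetQuote", {"symbol": "IBM"})
method, params = parse_request(message)
```

Because sender and receiver agree only on the XML format, neither side needs to know what hardware or operating system the other runs.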
The bad news is that not all web services speak the same “dialect”. Essentially, there are two camps. Microsoft, predictably, is responsible for one half of the schism. The software colossus claims its .NET web services framework sticks to industry standards. Critics counter that .NET doesn’t, making it difficult for users to switch to Java-based web services, .NET’s main competitor. And while the coalition around Java, which includes Sun Microsystems and IBM, is widely held to be more faithful to open standards, it still leaves companies with an unenviable choice to make.
The uncertainty leads Alexander Linden, a research director at Gartner, to warn companies against “creating mission-critical web-services projects until a clear case for revenue-generation opportunities can be made”.
Yet that shouldn’t stop companies from using the year ahead to put in place a framework for web services, insists Paul Jones, at consultancy KPMG Metrius. After all, he says, once issues such as standards and security are addressed, these services will reduce dramatically the cost of adding new functionality and consolidating IT systems. *

Artificial intelligence

In his latest film, AI, Steven Spielberg envisions a day when boy-robots fool humans into thinking they’re real. Robert Hecht-Nielsen also foresees a time when machine and human become indistinguishable. The difference is, while Spielberg makes fictional movies, Hecht-Nielsen makes real software used by some of the world’s biggest companies.
His California-based company, HNC Software, has been putting artificial intelligence (AI) to practical business use for some 15 years. Its software is based on the idea of neural networks, which associate concepts and data in the same way the human brain does. By unleashing this technique on millions of past transaction records, HNC helps companies predict the future behaviour of “good” and “bad” customers. The good customers detected by the technology are likely to give a company more business and provide up-selling and cross-selling opportunities.
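The approach can be sketched with a toy single-neuron network. Everything here is illustrative: the two features, the handful of training examples and the training loop stand in for the millions of historical records and far larger networks a real system like HNC's would use.

```python
# A toy "neural network" in the spirit described above: one neuron is
# fitted to labelled past transactions, then scores new customers.
# Features, examples and learning rate are invented for illustration.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(examples, labels, rate=0.5, epochs=2000):
    """Fit one neuron by gradient descent on labelled history."""
    weights = [0.0] * len(examples[0])
    bias = 0.0
    for _ in range(epochs):
        for features, label in zip(examples, labels):
            pred = sigmoid(sum(w * f for w, f in zip(weights, features)) + bias)
            err = label - pred
            weights = [w + rate * err * f for w, f in zip(weights, features)]
            bias += rate * err
    return weights, bias

# Features per customer: [average purchase size, payment delinquencies]
history = [[1.0, 0.0], [0.9, 0.1], [0.2, 0.9], [0.1, 1.0]]
labels  = [1, 1, 0, 0]              # 1 = "good" customer, 0 = "bad"
w, b = train(history, labels)

def score(features):
    """Probability that a new customer behaves like the good ones."""
    return sigmoid(sum(wi * f for wi, f in zip(w, features)) + b)
```

A customer resembling the historical good payers scores above 0.5; one resembling the delinquents scores below it.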
AI tools will also soon go one step further. At the moment, human salespeople are responsible for interacting with a company’s good customers. But in the future, predicts Hecht-Nielsen, the task will be handed over to cheaper, conversational computers. “This would make it possible for every consumer to enjoy the same kind of personalised service the British Queen does today,” he enthuses.
Given the growing threat of terrorism, however, companies and governments will be more interested in using AI to stop bad customers. Already, an alliance between HNC and PROS Revenue Management, which supplies pricing optimisation software to 15 major airlines, will help carriers flag suspicious flight bookings for further investigation. And both HNC and UK-based rival Searchspace are pushing AI technology aggressively as a weapon in tackling money laundering by terrorist organisations. *

Grid computing

Over the past few months, IBM has been busy unveiling several iterations of its “Grid” computing architecture, a system that tackles complex problems by harnessing the processing power of many computers connected by a high-speed network.
In its early days, such power is likely to come in handy for genetic research, weather modelling and other traditional supercomputing tasks. IBM is building a Grid for America’s National Science Foundation, which will perform 13.6 trillion calculations per second, making it 1,000 times faster than IBM’s chess-playing Deep Blue. According to John Patrick, vice-president of internet technology at the US computer giant, Grid computing should begin entering mainstream corporate computing next year. Among the sectors that have an immediate need for Grid are the pharmaceutical, energy and car industries.
For firms that shy away from the expense of building their own Grid, IBM points out that its Grid can be used like a power plant, dispensing services as needed. Patrick says the business case for such a utility-like approach is a strong one for two reasons. First, the Grid offers economies of scale. Second, companies will be spared the unpredictable performance of public networks, such as the internet, because the Grid is designed to handle spikes in demand.
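The scatter-gather pattern at the heart of a grid can be sketched as follows. The "nodes" here are simulated in-process for brevity; a real grid would ship each chunk over a high-speed network to a separate machine and collect the partial results back.

```python
# A sketch of the grid idea: one big job is split into independent
# chunks, each chunk is handed to a different node, and the partial
# results are merged. Function names are invented for illustration.
def split(job, n_nodes):
    """Partition the work into roughly equal chunks, one per node."""
    chunk = len(job) // n_nodes + 1
    return [job[i:i + chunk] for i in range(0, len(job), chunk)]

def node_compute(chunk):
    """What one grid node does: a CPU-heavy task on its share of data."""
    return sum(x * x for x in chunk)

def grid_run(job, n_nodes=4):
    """The coordinator: scatter chunks, then gather and merge results."""
    return sum(node_compute(c) for c in split(job, n_nodes))

total = grid_run(list(range(1000)))
```

The utility model IBM describes simply moves the coordinator and nodes out of the customer's data centre: the customer submits the job and pays per use.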
While IBM’s own global services unit is a logical candidate to operate a Grid utility, it faces competition from niche start-ups like Cluster Solutions of France (financial services) and Swiss-based Gridcomputer (life sciences). *

Wireless networking

There is almost universal agreement that wireless networking will be a hot technology next year. But that consensus ends abruptly when it comes to predicting which wireless standard will dominate 2002—Bluetooth, 802.11b (or Wi-Fi) or GPRS.
In many respects, of course, there isn’t much room for debate. Each standard serves a different need. Bluetooth provides a wireless link between devices up to ten metres apart, 802.11b up to 100 metres and GPRS between any two points on an ordinary GSM mobile phone network. The latter’s superior reach, however, must be set against the faster data-transfer rates offered by 802.11b and Bluetooth’s low cost.
In this light, the contest is really about which standard will beat the others in finding a home inside European organisations. Looking to the US for guidance is of limited use. While 802.11b, and now the speedier 802.11a upgrade, quickly won over American firms, there are good reasons why it may lose some charm this side of the Atlantic.
For a start, Europe has its own home-grown alternative to 802.11. Called HiperLAN-2, it offers, some wireless experts say, better support for streaming media and interactive applications than 802.11. But European users are in a bind: 802.11-based products are in the shops, but no one knows when the 5-GHz frequency both standards use will be allocated by national regulators. As for Bluetooth and GPRS, there’s no doubt about their place in the radio spectrum. But whether they’ve won over corporate Europe remains to be seen.
In any event, another development in 2002 might turn the standards dilemma on its head. Innovative companies such as US-based Embedded Wireless Devices are releasing devices capable of concurrently supporting multiple standards and seamlessly switching between them. With such devices, firms can invest in wireless networking without fear of being locked in to a standard that doesn’t get a regulatory stamp of approval. *

Micropayments

There is still no secure way to make small, one-off payments on the internet without a credit card. That leaves some 60% of consumers reluctant to make purchases over the internet, says Jupiter Research. This reluctance sank many internet businesses that were forced to rely on site advertising alone for revenue. Moreover, says Jakob Nielsen, a principal of the Nielsen Norman Group, an internet consultancy, it forced them to lower the quality of the “free” content and services they offered, undermining customer confidence further.
However, Nielsen, who is considered in many circles to be the web’s foremost usability expert, predicts 2002 could be the year when the “user-pays” model finally makes its way to the internet. He reckons countries like Denmark and Sweden will lead the pack. “These countries are small enough so that all of the major websites and internet service providers can get together and settle on the necessary technology to allow users to pay for services over the net,” Nielsen says.
In theory, this should address some of the main objections to alternative payment systems: there are too many of them and they are incompatible. In practice, there is no guarantee a spirit of co-operation will hold, let alone extend into more competitive markets such as the UK and France. Still, there are signs that a viable alternative to credit cards exists with programmes like Yahoo! PayDirect, a joint venture between the internet portal and HSBC, the UK-based bank. PayDirect users register their credit card or bank account with HSBC. They can then transfer money to and from their PayDirect account, which can be used to buy goods and services from participating merchants or other individuals with a PayDirect account. The next challenge will be to find a way of linking Yahoo!’s programmes with rivals like PayPal and Microsoft’s Passport. *
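The stored-value flow described above can be sketched as a simple ledger. The class and method names are invented for illustration; a real service like PayDirect adds registration, settlement with the bank and fraud checks around this core.

```python
# A sketch of a stored-value payment account: the user tops up from a
# registered card or bank account once, then makes small one-off
# payments from the balance without ever sharing card details with
# the merchant. Names are illustrative, not a real API.
class PayAccount:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0

    def top_up(self, amount):
        """Move money in from the registered funding source."""
        self.balance += amount

    def pay(self, other, amount):
        """A small payment to a merchant or another account holder."""
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount

alice, shop = PayAccount("alice"), PayAccount("shop")
alice.top_up(50)
alice.pay(shop, 3)      # a micro-payment; no card details change hands
```

The design point is that the credit card is used once, at top-up time, with a trusted bank; every subsequent micro-payment is just a ledger entry.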

Email

Email is a lousy business tool. Less secure than a postcard, it takes up valuable time as staff sift the wheat from the chaff, only to hide the wheat in a swamp of poorly organised in-boxes. In short, email is the curse of the modern organisation, except for one thing: it is an integral part of the way business is done. In fact, the failure of expensive knowledge management systems is proof that attempts to wean employees off person-to-person tools like the telephone and now email are doomed.
So instead of fighting the inevitable, a number of innovative companies are putting a new spin on the internet’s killer application. UK-based DespatchBox, for example, offers an inexpensive plug-in to encrypt some of the 9 billion unprotected email messages currently sent around the world each day. Another British company, K-Vault Software, sells email archiving software originally developed at computer maker Compaq. The software works with Microsoft Exchange, the world’s most popular email system, to build a fully searchable repository for email and email attachments.
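The searchable-repository idea can be sketched with an inverted index, the standard structure behind full-text search. This is a bare illustration with invented names, not how any particular archiving product is built.

```python
# A sketch of email archiving: messages go into a repository and every
# word is indexed, so any message can be found again later. The index
# is a simple inverted index (word -> message ids); names illustrative.
class MailArchive:
    def __init__(self):
        self.messages = []
        self.index = {}              # word -> set of message ids

    def archive(self, sender, subject, body):
        """Store a message and index every word in subject and body."""
        msg_id = len(self.messages)
        self.messages.append((sender, subject, body))
        for word in (subject + " " + body).lower().split():
            self.index.setdefault(word, set()).add(msg_id)
        return msg_id

    def search(self, word):
        """Return every archived message containing the word."""
        hits = self.index.get(word.lower(), set())
        return [self.messages[i] for i in sorted(hits)]

box = MailArchive()
box.archive("cfo@example.com", "Q4 budget", "IT spending frozen until spring")
box.archive("it@example.com", "Backup report", "nightly backup completed")
```

Because the index is built as messages arrive, a search touches only the index, not every message in the swamp of in-boxes.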
If companies need any more incentive to get their email systems in order, the increasing use of text messaging in businesses provides it. Already well established in the mobile consumer world, text messaging will add another pile of entries to employees’ in-boxes. To avoid letting even more “knowledge capital” seep out of their business, companies will need to bring knowledge management to the tools people actually use, rather than vice versa.

Collaborative commerce

Collaborative commerce (or c-commerce) has as many meanings as there are analysts and vendors with an interest in defining it. An alternative way of getting to grips with the concept, however, is to take a quick look at the history of enterprise software.
Beginning in the early 1990s, enterprise resource planning (ERP) software took off as a way to automate the back office of big companies. By the late 1990s, B2B e-commerce was supposed to link that automated back office with those of customers and suppliers to form a “virtual” marketplace. The problem was most companies found they didn’t want to play in an anonymous marketplace after all. They wanted to work with their old partners, but do it more efficiently by moving transactions and other interactions online. C-commerce covers the plethora of standards and technologies becoming available to support this need.
The building blocks are XML-based standards for linking computer systems, such as xBRL in the financial arena. To these, c-commerce adds a layer of collaborative tools based on voice, instant messaging, email, video, mobile devices and, of course, the world wide web.
For this reason, a c-commerce evangelist at JD Edwards, Nick Rawls, expects customer relationship management (CRM) to lead the c-commerce charge in 2002. Judith Dixon, a marketing manager at Syntegra, a systems integrator, agrees, adding that the aim is that “client, product and market information, contained in CRM systems, existing in-house systems or third-party applications, can be accessed either at a desktop in the fixed world or via PDAs while on the road.” *
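The XML building block mentioned above can be sketched as follows. The element names are simplified illustrations, not the real xBRL taxonomy: the point is only that a tagged financial fact can be extracted by a partner's system without human re-keying.

```python
# A sketch of XML-tagged financial facts: each figure carries a
# machine-readable tag, so a partner's system can pull it straight
# out of a filing. Element names are invented, not real xBRL.
import xml.etree.ElementTree as ET

report = """
<report currency="EUR" period="2001">
  <revenue>1200000</revenue>
  <operatingCost>900000</operatingCost>
</report>
"""

def extract(xml_text, element):
    """What a partner's system does: read one tagged fact from a filing."""
    return int(ET.fromstring(xml_text).findtext(element))

margin = extract(report, "revenue") - extract(report, "operatingCost")
```

Once both sides agree on the tags, the same report can feed a bank's credit model, a supplier's risk check or a regulator's database with no re-typing.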

Unbreakable security

The September 11th terrorist attacks forced companies to take another look at their IT spending priorities. And if security wasn’t already at the top of the list, it is now. That’s led many experts to predict 2002 will see the rise of unbreakable security: not a single product, but a collection of related technologies that together deliver a level of security well above what might have been acceptable before.
One of the big security technologies is smart cards, already familiar to many Europeans. Most retail banks offer customers simple versions of these computers-on-a-bit-of-plastic. Following September 11th, however, expect both companies and governments to show renewed interest in sophisticated smart cards, which incorporate biometric details such as fingerprints and eye patterns. “The need to identify employees, travellers and citizens rapidly, both to protect corporate assets and to return to some level of revenue growth in the travel industry, will be the driving cause for this movement,” says Rob Enderle, a research fellow at Giga, a research group.
Corporate IT departments will also look to introduce unbreakable security to the back office. Information infrastructures designed to withstand any failure or intrusion will become de rigueur.
There are already products that help on this front. Oracle’s Virtual Private Database, for example, pushes user authentication all the way down to individual rows in a database table. So even if hackers break into the application level above, the underlying database will still be protected. Unbreakable security, says Norman Green, financial director at Oracle UK, Ireland and South Africa, is about reducing downtime which, according to Standish Group, can cost an organisation from €2,700 to €8,500 a minute. *
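The row-level idea can be illustrated with a simplified sketch. This is not Oracle's actual Virtual Private Database mechanism; sqlite3 stands in for the real database, and the predicate handling is deliberately naive (it assumes the incoming query has no WHERE clause of its own).

```python
# A sketch of row-level security: a per-user predicate is appended to
# every query before it reaches the table, so each user sees only the
# rows they own, even if the application layer above is compromised.
# sqlite3 stands in for a real enterprise database here.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (owner TEXT, balance INTEGER)")
db.executemany("INSERT INTO accounts VALUES (?, ?)",
               [("alice", 100), ("bob", 250), ("alice", 40)])

def secure_query(user, sql):
    """Append the security predicate; the caller cannot bypass it."""
    return db.execute(sql + " WHERE owner = ?", (user,)).fetchall()

alice_rows = secure_query("alice", "SELECT owner, balance FROM accounts")
```

Because the predicate is attached inside the database layer, a hacker who controls the application still only retrieves the rows the compromised account owns.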

Convergence

Convergence—the tantalising prospect of carrying voice, data and video over a single digital pipe—has been topping the hit-parade list of hot technologies for a decade. Frustratingly, it has never left those charts for the mainstream. The year ahead, however, will be different, say experts. They cite two developments—September 11th and the launch of GPRS mobile data services in Europe—that could transform convergence from a nice-to-have technology into a must-have.
Granted, the economics of a unified communications network have never been in doubt. But corporate inertia, combined with an “if it ain’t broke don’t fix it” mentality, has killed most convergence projects, with the exception of some greenfield sites where the cost benefits are even clearer. The recent terrorist attacks, however, upset this equilibrium. Applications that take advantage of a converged network, like video conferencing, have suddenly become popular. As has the idea of decentralising corporate headquarters operations after years of trying to centralise everything. The more greenfield sites and corporate offices there are to link, the more attractive the idea of a converged wide area network (WAN), for example, becomes.
The other driver in the equation is mobile data. The launch of commercial GPRS services in Europe at the end of this year will make many business users comfortable with the idea of a single network for voice and data. More important, it will entice companies to put in place the foundations of a converged infrastructure so that field workers and sales staff are provided with mobile access to enterprise applications. Vendors like Cisco and Commworks, a 3Com company, are ready to pounce on this new opportunity and sell networking products off that foundation. Commworks, for example, is marketing what it calls “softswitch technology”. By using software rather than hardware to route all types of traffic, softswitch allows a single network infrastructure to carry any medium—voice, data, fax or video. *
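The softswitch principle can be sketched as a dispatch table: routing decisions made in software, so one network carries any medium. The handler names and routing policies below are invented for illustration, not Commworks' actual design.

```python
# A sketch of the softswitch idea: each packet is tagged with its media
# type, and a software lookup table picks the route, instead of a
# dedicated hardware network per medium. Names are illustrative.
def route_realtime(payload):
    """Voice and fax are latency-sensitive, so take the priority path."""
    return ("low-latency path", payload)

def route_bulk(payload):
    """Ordinary data and video tolerate best-effort delivery."""
    return ("best-effort path", payload)

HANDLERS = {"voice": route_realtime, "fax": route_realtime,
            "data": route_bulk, "video": route_bulk}

def softswitch(packet):
    """One switch, any medium: dispatch on the media-type tag."""
    media, payload = packet
    return HANDLERS[media](payload)

path, _ = softswitch(("voice", b"hello"))
```

Changing how a medium is carried then means editing a table entry in software, not installing new switching hardware.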

Autonomic computing

Unlike all the other technologies covered in this overview, autonomic computing won’t be ready for implementation in 2002. But because it neatly encapsulates what all technologies strive for, we’ve decided to include it in our line-up of top technologies.
Autonomic computing is basically a reaction to the growing complexity of enterprise computing. These days, a computer network is a complex architecture run by thousands of lines of code, which is not only hard to use, but also hard to manage. It’s this complexity that “threatens to undermine the very benefits information technology aims to provide”, says Paul Horn, senior vice-president of IBM Research, which is backing the concept.
For a way out, autonomic computing has turned to one of the most complex systems of the human body for inspiration—the nervous system. “Consider the autonomic nervous system,” Horn says. “It tells your heart how many times to beat and checks your blood’s sugar and oxygen levels. But most significantly, it does all this without any conscious recognition or effort.” He thinks it is time to design and build computing systems that hide complexity from the user in the same way.
Work is already under way to put flesh on this still fuzzy concept. Current research projects at labs and universities include self-evolving systems that can monitor themselves and adjust to certain changes, “cellular” chips that are capable of recovering from failure, and heterogeneous workload management that balances and adjusts workloads of many applications over various servers.
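The workload-management strand of that research can be sketched as a self-adjusting loop. The threshold, step size and server names are invented; real heterogeneous workload managers are far more sophisticated, but the reflex is the same: detect an overload and shed it without operator effort.

```python
# A sketch of the autonomic reflex: the system monitors its own load
# and rebalances work across servers without conscious operator effort,
# the way the nervous system regulates heartbeat. Values illustrative.
def rebalance(loads, threshold=0.8, step=0.1):
    """Shift work from overloaded servers to the least-loaded one."""
    loads = dict(loads)                     # leave the input untouched
    for server in list(loads):
        while loads[server] > threshold:
            coolest = min(loads, key=loads.get)
            if coolest == server:
                break                       # nowhere left to shed load
            loads[server] -= step
            loads[coolest] += step
    return loads

before = {"web1": 0.95, "web2": 0.30, "web3": 0.40}
after = rebalance(before)
```

After the loop runs, no server sits above the threshold and the total amount of work is unchanged; only its distribution has adapted.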