Several technologies that have been percolating around the edges of mainstream business will bubble up to the surface this year, and CIOs and IT managers need to be prepared for the opportunities they represent--or risk getting burned.
Radio frequency identification will begin to ramp up the data loads IT centers must handle, as the tags become more pervasive. Web services will present workaday challenges, as managers are tasked with integrating Web-based apps into enterprise computing systems. The cost savings promised by server virtualization will be too compelling to pass up. Graphics processing will get a boost from the advent of Microsoft's Vista operating system. And as far-flung workforces face new and more troubling threats, mobile security will be more of a challenge.
RADIO FREQUENCY IDENTIFICATION
"What ERP did to the enterprise, RFID will do to the supply chain," says Marlo Brooke, senior partner at Avatar Partners, a systems integrator. "It's all about centralization, visibility, and automation."
RFID isn't new, having been around in one form or another for more than a decade. Over the last several years, Wal-Mart and the Department of Defense have helped move RFID into the mainstream, using the technology to track everything from pallets to people to pill bottles, and insisting that their partners adopt it as well. RFID standards are solidifying, making it easier to develop applications and to get the various pieces to interoperate. Products such as Reva Systems' Tag Acquisition Processor make it easier to funnel RFID data directly into inventory, manufacturing, and supply chain systems.
There are challenges. An RFID deployment needs to take into account potential radio frequency issues and how wireless networks are deployed across an organization. Warehousing and inventory experience is also needed to collect the scanned information and integrate it into existing supply chain applications. IT shops that embrace RFID will have to handle the massive data dumps the technology generates, route that data to the right places within their applications infrastructure, and act on it as part of their decision-support systems.
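To make that concrete, here is a minimal sketch of the kind of filtering that has to happen between the readers and the back-end systems. The tag format, reader feed, and inventory hook are hypothetical; they stand in for whatever middleware and supply chain applications a given shop actually runs.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical raw feed: each read is (tag_id, reader_id, timestamp).
# Readers fire many duplicate reads per tag, so the first job is to
# collapse them before anything touches the inventory system.
DEDUP_WINDOW = timedelta(seconds=30)

def dedupe(reads):
    """Drop repeat reads of the same tag at the same reader within the window."""
    last_seen = {}
    for tag_id, reader_id, ts in reads:
        key = (tag_id, reader_id)
        if key in last_seen and ts - last_seen[key] < DEDUP_WINDOW:
            continue
        last_seen[key] = ts
        yield tag_id, reader_id, ts

def route(reads, post_inventory_update):
    """Aggregate deduplicated reads per reader and hand counts to inventory."""
    counts = defaultdict(int)
    for tag_id, reader_id, ts in reads:
        counts[reader_id] += 1
    for reader_id, count in counts.items():
        # post_inventory_update is a placeholder for the real supply chain API.
        post_inventory_update(location=reader_id, items_seen=count)

if __name__ == "__main__":
    sample = [
        ("EPC-0001", "dock-3", datetime(2007, 1, 15, 9, 0, 0)),
        ("EPC-0001", "dock-3", datetime(2007, 1, 15, 9, 0, 5)),  # duplicate read
        ("EPC-0002", "dock-3", datetime(2007, 1, 15, 9, 1, 0)),
    ]
    route(dedupe(sample), lambda **kw: print("inventory update:", kw))
```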
WEB SERVICES
Last year saw plenty of buzzwords describing the "Webification" of business applications: blogs, mashups, the rewritable Web, RSS feeds, software as a service, social networking spaces, Web 2.0, and wikis. Buzzwords or not, the Web has become a solid delivery platform for applications, and Web services will change the way we deploy enterprise software.
The trick is in paying attention, because the Web services movement is producing better and more capable enterprise-class applications that can be deployed in a fraction of the time traditional apps require. IT managers can combine Web-based applications to piece together what they need. For instance, you can take a mapping service such as Yahoo Maps or Google Maps and tie in the locations of your current sales leads to determine where to deploy your sales force.
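As a rough illustration of how thin that glue can be, the sketch below takes a handful of already-geocoded leads and builds a Google Static Maps URL that plots them. The lead data, the API key, and the choice of the Static Maps endpoint are assumptions for illustration; the same coordinates could just as easily feed Yahoo Maps or an internal dashboard.

```python
from urllib.parse import urlencode

# Hypothetical sales leads, already geocoded to latitude/longitude.
leads = [
    {"name": "Acme Corp", "lat": 40.7128, "lng": -74.0060},
    {"name": "Globex",    "lat": 41.8781, "lng": -87.6298},
    {"name": "Initech",   "lat": 29.7604, "lng": -95.3698},
]

def leads_map_url(leads, api_key="YOUR_API_KEY", size="640x480"):
    """Build a Google Static Maps URL with one marker per sales lead."""
    markers = "color:red|" + "|".join(f"{l['lat']},{l['lng']}" for l in leads)
    params = urlencode({"size": size, "markers": markers, "key": api_key})
    return "https://maps.googleapis.com/maps/api/staticmap?" + params

print(leads_map_url(leads))
```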
Many of these mashups begin with one or more hosted applications. There's Zimbra for hosting enterprise-class E-mail, Amazon.com's S3 for off-site disk storage, Concur for expense reporting, and Jive Software's Clearspace for document and workflow management, to name just a few.
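The programming overhead for these services is often minimal. As a hedged sketch, storing a nightly backup in S3 takes only a few lines using the boto3 SDK; the library choice, bucket name, and file names are assumptions rather than anything the services mandate, and any S3-capable client would do.

```python
import boto3

# Assumes AWS credentials are already configured (environment or config file)
# and that the bucket below exists; both are placeholders.
s3 = boto3.client("s3")
s3.upload_file(
    Filename="backup-2007-01-15.tar.gz",   # local archive to push off-site
    Bucket="example-offsite-backups",
    Key="nightly/backup-2007-01-15.tar.gz",
)
```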
"Hosted applications [give me] a new and more flexible opportunity for providing application solutions to my clients," says Dan Parnas, a director at online brokerage Charles Schwab. "They have significantly lower up-front cost and the ability to bring the application online relatively quickly."
The good news about software as a service, says Doug Neal, a research fellow with Computer Sciences Corp.'s Leading Edge Forum executive program, is that it provides a software architecture and business model that can meet the growing need for agility. "We've seen this movie before with the invention of the PC," he says. "Resistance was futile then and it's futile now."
SERVER VIRTUALIZATION (FOR FREE!)
The concept behind virtual machine software is simply stated but hard to implement: Divvy up a single server into separate virtual machines, each with its own memory, virtual hardware, drive images, and other resources. It also isn't new--IBM has been doing this on its mainframes for more than 30 years. What's new is that the power of VM technology now can be exploited on Intel-based servers, which sit underutilized in most data centers.
The idea is to run multiple operating systems and applications in the same box, making it easier to provision a new server and to make more productive use of the hardware. Unlike in the mainframe era, multiple VMs let IT shops cut the cost of software development and simplify configuration as they deploy new servers. "Two years ago, it wouldn't have been possible to handle so much workload in a data center," says Rene Wienholtz, the CTO of Strato, a German Web-hosting provider that has deployed virtualization software.
Karen Green, the CIO of Brooks Health System, is also a believer in virtualization. "We plan to use virtual server management to reduce our server support efforts, minimize downtime, and reduce the ongoing costs of server replacement, enabling us to support more hardware with existing staff," she says.
The fact that Microsoft and EMC's VMware are giving away their virtual machine software, along with preconfigured VM applications known as virtual appliances, makes a strong argument for investigating the advantages of the technology. Microsoft, for example, offers a virtual disk image containing Windows XP with Service Pack 2 and Internet Explorer 6 for shops that need to run IE 6 and IE 7 side by side.
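For teams that want to experiment before committing to a vendor's stack, the open-source QEMU tools give a feel for how little is involved in carving out a guest. This is a minimal sketch rather than the products discussed above; the disk size, memory, and installer ISO are placeholders.

```python
import subprocess

# Create a 20 GB growable disk image for the guest (qcow2 allocates space
# only as the guest actually writes data).
subprocess.run(
    ["qemu-img", "create", "-f", "qcow2", "guest01.qcow2", "20G"],
    check=True,
)

# Boot the guest with 1 GB of RAM from an installer ISO; once the OS is
# installed, drop the -cdrom/-boot options and boot from the disk image.
subprocess.run(
    [
        "qemu-system-x86_64",
        "-m", "1024",
        "-smp", "2",
        "-hda", "guest01.qcow2",
        "-cdrom", "installer.iso",
        "-boot", "d",
    ],
    check=True,
)
```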
MOBILE SECURITY
The network perimeter is gone. That means companies need to protect themselves from the outside world, not only from intruders but also from infected insiders. The hitch is in delivering consolidated mobile and end-point security across a company, covering multiple desktop operating systems, nondesktop network devices such as print servers, and various switch and router technologies. That's a tall order, especially as most IT shops already have some collection of perimeter security devices that will need to work with whatever end-point solution is put together.
Most networks authenticate users via logon credentials but don't examine the actual desktop or laptop hardware the user is running. So extra steps are needed: scan the file system for Trojans or key-logging programs, check that installed patches and antivirus signature files are up to date, and, if not, take steps to fix what's wrong. There are several proposed responses. Microsoft has its Network Access Protection (NAP) architecture, and Cisco Systems has one called Network Admission Control; each covers slightly different aspects of end-point security. Juniper Networks and other networking vendors offer authentication systems under the Trusted Network Connect architecture from the Trusted Computing Group. That architecture uses open standards and taps into the "trusted" hardware chips incorporated in most new laptops.
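A bare-bones version of that health check is easy to picture. The sketch below verifies that an antivirus signature file is fresh and that required hotfixes are installed before granting access; the file path, hotfix IDs, and the decision step are hypothetical stand-ins for whatever a NAP- or NAC-style agent would actually enforce.

```python
import os
import subprocess
import time

MAX_SIGNATURE_AGE_DAYS = 7
SIGNATURE_FILE = r"C:\ProgramData\ExampleAV\definitions.dat"   # placeholder path
REQUIRED_HOTFIXES = {"KB123456", "KB234567"}                    # placeholder IDs

def signatures_current():
    """True if the antivirus definition file was updated recently enough."""
    try:
        age_days = (time.time() - os.path.getmtime(SIGNATURE_FILE)) / 86400
    except OSError:
        return False
    return age_days <= MAX_SIGNATURE_AGE_DAYS

def hotfixes_installed():
    """True if every required Windows hotfix shows up in the installed list."""
    out = subprocess.run(
        ["wmic", "qfe", "get", "HotFixID"], capture_output=True, text=True
    ).stdout
    return REQUIRED_HOTFIXES.issubset(set(out.split()))

if __name__ == "__main__":
    healthy = signatures_current() and hotfixes_installed()
    # In a real deployment the network enforcement point, not the client, acts here.
    print("grant network access" if healthy else "quarantine for remediation")
```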
Some shops aren't waiting for the architecture to settle. The Fulton County, Ga., government is moving forward with Microsoft's NAP software and began trials about 10 months ago with beta copies of Vista and Longhorn. The county is using IPsec authentication, and its NAP deployment checks a series of health requirements, including making sure the Norton antivirus client is current, before handing out an IP address on its network to remote users.
ADVANCED GRAPHICS PROCESSING
Two developments are changing the nature of graphics in business computing: greater use of 3-D images, and the use of graphics processors for computation. Not only are more applications making use of 3-D, but operating systems are also using 3-D elements as part of their basic tasks. Microsoft's Windows Vista is a good example. One of its most highly touted features is the "Aero glass" interface, which layers see-through elements on top of one another. But it doesn't come cheap: Aero requires at least 128 Mbytes of dedicated graphics memory (256 Mbytes is better).
Andy Keane, the general manager of GPU computing for graphics chipmaker Nvidia, says he's seen greater adoption of 3-D graphics as a visualization tool in the oil and gas, medical imaging, and computer-aided design industries. Three-dimensional graphics are part of the basic function set for leading interactive applications, Keane adds. "3-D isn't just about games."
The new graphics cards being developed by Nvidia and ATI (now part of Advanced Micro Devices) may have a bigger impact on computational processing than the latest chips from Intel and AMD. As graphics processors become more powerful, they can offload computational work from the computer's main central processing unit. Nvidia has run a program for several years to assist developers who want to harness its graphics engines for computational applications. Keane says he's seen applications that once ran only on racks of clustered servers, such as Acceleware's electromagnetic simulation software used to design cell phone antennas, fit comfortably on a single workstation.
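To give a sense of what offloading computation to the GPU looks like in practice, here is a minimal sketch using PyCUDA, a third-party Python wrapper around Nvidia's GPU toolkit; the choice of wrapper and the array sizes are assumptions, not something the article prescribes.

```python
import numpy as np
import pycuda.autoinit              # creates a CUDA context on the default GPU
import pycuda.gpuarray as gpuarray

# A largish array of single-precision values, the format GPUs handle best.
host_data = np.random.randn(4_000_000).astype(np.float32)

# Copy to device memory; arithmetic on the gpuarray runs on the GPU,
# not on the host CPU.
device_data = gpuarray.to_gpu(host_data)
device_result = 3.0 * device_data * device_data + 2.0 * device_data + 1.0

# Pull the result back to host memory and spot-check it against NumPy.
result = device_result.get()
expected = 3.0 * host_data**2 + 2.0 * host_data + 1.0
print("max difference vs. CPU:", float(np.abs(result - expected).max()))
```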
What this means for IT managers is that graphics processing is a key component of their PC strategies and needs to be managed just as carefully as the software and CPU resources. It also means that the days of buying PCs with graphics capabilities integrated on the motherboard are probably numbered, as this configuration doesn't deliver enough performance.