

Pocket App Review
Tablets, phones and applications – all here.


October 3, 2013

Choosing The Right Help Desk

Open source help desk software was created to make life easier for business owners. Because it gives customers an easy way to raise their concerns, owners find it much simpler to manage their business well: they can offer 24/7 customer service support via the internet, and they get an overview of their strengths as a business. However, choosing the right open source help desk can be a difficult process. Several online sources provide help desk software for free, and with so many benefits and features on offer, it can be confusing to decide which one to install.

To choose well, pick the help desk software that meets your needs. First, know your business and select software with flexible features; whether your business is small or big, providing customer support should not be hard. Second, consider the software's disadvantages. If they outweigh the benefits you could get, do not download it: the software might not be a fit for your business. Lastly, check where the help desk software comes from. Even if it is given away for free, do not download software that comes from a bogus site. You can always find the best open source help desk if you do a lot of research on the internet.

Maximizing The Open Source Help Desk

One of the best tools for ensuring customer satisfaction is the open source help desk. This software is very helpful for maintaining customer trust and loyalty, especially when competition in the market is tight. It gives customers a place to leave feedback and ask the business owner for help. Since it is installed on a particular page of the website, business owners can show their customers that they are valued, carry out their own tasks efficiently, and provide the best possible products or services. Beyond that, an open source help desk helps business owners keep everything well managed.

The good thing about a help desk is that it offers more advantages than disadvantages. However, you must choose the right software for your business in order to make the most of it. There are thousands of free help desk packages available on the internet, and it can be difficult to choose which one works best. To find the best help desk, do plenty of research: always compare each package against the others, and check its source as well. An open source help desk can be used to increase a business's revenue, and it boosts customers' confidence in the long run.

July 8, 2013

Tips On Finding A Medical Billing Training Course

There are programs available that offer just billing training, but you should consider a program that covers both billing and coding, because coding is really the first step in the process of electronic medical billing. Some universities offer medical coding and billing courses that help prepare students for jobs managing patient information and insurance claims. Students learn electronic medical billing and coding software, medical terminology, and how healthcare clinics and facilities work. These courses can prepare you to take the Certified Coding exams where required. For example, one college offers a 29-week master certificate program that includes ICD-9 and CPT-4 coding.

Students take courses in medical terminology, anatomy and physiology, and medical ethics and law. This college offers classroom training and students can participate in an externship program where they can practice their newly learned billing and coding skills.  You will find that some of the schools offer online courses and good self-study formats, important if you plan to study part-time or while you are holding down another job. There are also quite a number of online medical billing training courses offered by various institutions. There is much to be learned about on-campus or online medical billing training before you commit. Get more details at www.medical-billing.net

February 5, 2013

Predicted Computing Megatrends Prove True

Multimedia is the first megatrend that we will examine. Multimedia really means multiple data types or formats, not multiple media. Pragmatically, the term “multimedia” has been used to define computers with the capability to play sound and rich graphics sequences. Video and sound files are enormous relative to classical computer data. CD-ROM has been the only economical storage media and device capable of handling such files. Therefore, today’s multimedia computers all feature CD-ROM drives. I think of multimedia as computing’s attempt to mirror every form that human thought can take.

Like it? It's vaporware, silly.

Computers are, if nothing else, the best device we have found for embodying human thought. Thought is not just an invoice or an image, not just a sound or a voice. It is a process. It is algorithms. It is logic. It is insight. We can also solidify our thought processes into programs that can be repeated again and again with no wear and tear on our neurons. The demand for increasing multimedia capability will continue until computers are able to input, manipulate and output virtual realities that are indistinguishable from “real” reality.

But long before that happens, computers will be able to create virtual worlds with rich human sensory inputs that have before only been experienced in dreams and imagination. Some day we may indeed experience “Star Trek’s” Holodeck and more. Does this sound like the stuff of science fiction? So did today’s computers just before the turn of the last century. The other two computing megatrends, connectivity and mobility, are so close to being equally the second most important that I could argue either way. Let’s look at connectivity first. Our need to be connected with each other and with information systems and databases has spawned enormous markets for computer-based communication products and services including: E-mail, voice-mail, servers, LANs, WANs, intranets, Internet and wireless NICs with all the necessary datacom paraphernalia from satellites to RJ-45 jacks not to mention countless versions of driver software.

Most of us, if we or our company could afford it, would choose to be connected most of the time that we are using our computer. At other times we might use our computer just so that we could be connected. We would want not only to be connected within our own organization, but also to the Internet as the gateway to the rest of our world, and to information services that we use frequently. We would want to be connected while at work, at school, at home, on airplanes, in hotels, in meeting rooms and even in between.

In a perfect IT world, we would roam carefree between wired and wireless connections, never worrying about costs or cables. All this so that we could work, learn and play together without restrictions of space, time or form. With the World Wide Web and other Internet applications, our expectations for connectivity are being redefined at amazing speed and with delightful imagination. The Web phenomena happened because connectivity creativity landed in the hands of individuals and small groups. It has not required enormous capital investment by those who would innovate applications. It has not been controlled. The last time such an outrageously enchanting usurpation of technology occurred was with the personal computer.

As PC applications have enriched human horizons, so will the Web. I don’t see the Web as replacing the PC as some would hope. Rather, I see the Web exploding because of, and on top of, PCs. I see the Web as enabling exponential experimentation on top of PCs, which have been and will continue to be exponentially innovative.

At the same time we need to be connected within rich, shared information environments such as those that are being born today on the Web, we also need to create alone and to communicate from unlikely places. The office or even home where our high-speed datacom lines reside are but restrictive subsets of where we want to create or communicate. Thus the need for mobile PCs. Mobile PCs such as notebook computers and PDAs are computers that we are willing to lug because they are worth their weight in the computing and communication applications that they enable. Mobile PCs are a fundamental need for those who have developed a rich relationship with their PC. Where we want to think, create, learn or engage in digital play, we want our PC no matter where we are.

Others have such a strong need to communicate in their digital world that they carry a mobile PC for just that purpose alone. Mobility is enabled by smallness of size, lightness of weight and long-life battery power, each of which must be balanced with the desire for a usable PC featuring a keyboard and display as nearly like that of a desktop PC, which has set our expectations, as possible. This balancing act is both the designer’s nightmare and opportunity.

Today we carry our mobile computers. With handheld personal computers (HPCs) enabled by Microsoft’s Windows CE, we will pocket them. In the not-too-distant future we may wear them as badges, jewelry, glasses or hearing aids. The multimedia and mobility megatrends are in a somewhat paradoxical relationship. The more types of data a mobile computer can handle, the bigger and heavier it must be. In this tension great mobile PCs with sound and video have been born. And, in this same tension, glaring mistakes have been made in the creation of products that were too heavy or too costly for their market window.

The connectivity and mobility megatrends are also in a paradoxical relationship. How can one be connected while moving around? From the interaction and conflicts in the relationship of these two megatrends have sprung myriad connectivity options for mobile computers. PCMCIA grew in this crack. Datacom companies have invested huge sums in wireless datacom in the belief that if we are mobile we will want to be connected even more than we do when we are stationary. Mobile datacom companies point to the cell phone market as proof that customers will pay more and put up with intermittently usable services with less-than-universal coverage. They are spending billions finding that extrapolating the truths of the cell phone market to datacom may be a mistake, at least in the short term.

Multimedia is also at odds with connectivity. Being connected with text and numbers is one thing, but being connected with live duplex video is quite another in bandwidth requirement and cost. There are those who believe that multimedia will bring the Internet to its knees in a prayer for infinite, free bandwidth. I am not among them.

Why are we so obsessive in our desire for multimedia, connectivity and mobility? Products and services in these three computing areas have grown and are growing in ways that surpass anything that we simply "need."

I believe that there are two fundamental human desires that are so great that we will never have enough. One is to create. Viewed from afar, the creations of humans are not in every topsy-turvy direction. Rather, we are building an external world that reflects our true internal nature, both individually and collectively. Through the external world that we are creating we are mirroring and exploring ourselves.

The other unquenchable human desire is to collaborate. To work and play together. Togetherness enables the ultimate creativity. Many limit the definition of IT-assisted collaboration to the realtime interaction of people with other people using IT environments such as telephone and videoconferencing.

January 22, 2013

Defining The Portable Concept

Just a couple of years ago, if you said, “mobile computing,” everyone knew what you were talking about: laptops. These densely packaged, battery-powered little PCs were popping up everywhere. But the rapid increase in processing speeds and chip densities, coupled with the advancement of display technology, has lowered the cost of computing power, sparking the development of whole new classes of computing and communications devices.

Today we have notebooks, subnotebooks, mini-notebooks, hand-held PCs, personal digital assistants, smart phones, and two-way pagers. But we don’t need to tell you that; you probably already have two or more of these little devices. And we don’t need to tell you which kind of device is good for particular kinds of computing and communications; you’ve probably got that down, too. Instead, this article looks at how these devices are going to evolve into their next stages–what technologies need to change and what kinds of devices you can expect to be able to purchase during the coming year.

Balancing Act

Designing a portable computer is one of the ultimate expressions of the phrase “engineering trade-offs”; when you change x, you must also change y. If you want a larger monitor, you’re going to add weight and power consumption. Conversely, if you want a 2-pound computer, you’re going to sacrifice things like battery life, monitor size, and keyboard size. Everything must balance.

Here are the 15 areas where most of the trade-offs happen:

1. Case design, size, and weight
2. Processing power
3. Memory
4. Keyboard
5. Navigation devices
6. Display
7. Video circuitry
8. Video input
9. Storage
10. Communications and networking
11. Battery life
12. Heat dissipation
13. Connectors
14. Additional I/O devices
15. Expandability

We can examine some portable-computing issues, including how these machines interact and where they might be headed, by designing what at least one BYTE editor considers an ideal portable computer, looking at what is (and isn’t) possible with today’s technology (see the text box “Proposing the Perfect Portable” on page 80NA 2).

As for what we can expect during the next few years, let’s look at it while performing a modified version of that hoary old party dance, the hokey-pokey.

You Put Your Data In

Getting data into a computer is a necessary first step. Right now, for most purposes, that means a keyboard. Some of the ultra-thin systems coming out this year have keyboards with a very short throw, a millimeter or two, which makes a keyboard feel stiff. The ultra-narrow systems have narrow keyboards, in many cases too narrow for touch-typing (witness Toshiba's Libretto and most of the Windows CE hand-held PCs). IBM has experimented with its "butterfly" collapsing keyboard. All these examples have drawbacks, and no "savior keyboard" is on the horizon.

But speech recognition, already surprisingly effective, is developing fast, and some full-size laptops now come with built-in software. Micron, for example, bundles Dragon Systems’ Naturally Speaking with its notebooks. Speech will likely become much more widespread in the future, but it’s unlikely to replace the keyboard completely in the foreseeable near term–either on the road with portables or in our open-door, open-top office cubicles.

What about graphics-based input? Toshiba’s Tecra 750CDT laptop was the first to include a video camera for conferencing and scanning. This should also be a boon for those who do library research.

Smaller machines (i.e., PDAs) can get by with stylus-based input, either through tappable on-screen keyboard grids, like the T9 keyboard in Texas Instruments’ Avigo, or handwriting recognition, like the Graffiti alphabet in the 3Com PalmPilot. But this works simply because you don’t enter much data that way. According to several vendors, a stylus will be around for some time in small-screen devices, simply because fingers are too big and regular pens damage display screens. For serious data entry, such as getting 1000 addresses into a PDA, you download the information from your PC over a cable or IR link; there’s no practical alternative.
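The bulk-transfer workflow described above, preparing contact data on the PC and shipping it to the PDA in one pass over a cable or IR link rather than tapping it in by stylus, can be sketched roughly like this. The CSV format and the field names are hypothetical, purely for illustration:

```python
import csv
import io

# Hypothetical contact records prepared on the desktop PC.
contacts = [
    {"name": "Ada Lovelace", "phone": "555-0100"},
    {"name": "Alan Turing", "phone": "555-0199"},
]

def export_contacts(records):
    """Serialize contact records to CSV text for transfer over a cable/IR link."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "phone"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# One serialized payload replaces thousands of stylus taps.
payload = export_contacts(contacts)
print(payload)
```

The point is the shape of the workflow, not the format: any compact, machine-readable serialization beats per-record entry on a tiny screen.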

You Get Your Data Out

With a portable, output is normally the display screen; printing is rarely an issue. Displays represent a problem in terms of future development, for several reasons. First, today’s flat-panel LCD screens are already good enough, big enough, and bright enough for most users; they don’t need significant improvement. “The 14-inch TFT [thin-film transistor] display is actually significantly better than most 17-inch CRT monitors,” says Greg Munster, product marketing manager for Hewlett-Packard’s mobile-computer division, “and thus it’s all most users really need or want.” But the big screens don’t cut it in terms of price and power consumption. Ironically, the worst problem of all might be size. We all want a bigger screen, but we want the total package to be as small and light as possible.

How do you shrink a display screen without shrinking the image? There are three likely possibilities: a display whose physical size can be reduced for transport and enlarged for use (e.g., some kind of foldable LCD or mirror-based system), an image projector with a folding screen, or a tiny image that’s magnified by a lens. The first would be useful, but so far no one seems to have invented one. The second is really just speculation, because it raises even more severe power and brightness questions than the current technology.

But the third might soon be possible. A number of companies, including DisplayTech, Kopin, and Siliscape, are developing small LCD displays that you can hold up to your eye behind a lens (think one-eyed View-Master slides, and you get the idea) to see a decent color image.

At the present time, the resolution is at VGA levels and the number of colors is limited, but the potential is there (see “Mini Displays Get Sharper Focus,” September 1997 BYTE, page 24). Indeed, Rockwell incorporated such a display into its body-mounted computer, the Trekker (see “Wearable Pentium,” September 1996 BYTE). Kopin has demonstrated a display small enough to be built into a Motorola StarTAC, the smallest cellular phone on the market, so getting faxes on the run might someday be truly easy.

You Store It on Your Hard Drive

Disk drive technology is, for the moment, advancing faster than Microsoft’s attempts to occupy it all for Office 9x, so storage capacity isn’t much of a problem. IBM is currently supplying 8.4-GB hard drives in some ThinkPads, and more will come. Solid-state or other nonmagnetic technologies might eventually replace magnetic disks, but not soon. And high-capacity removable drives–those of the Zip/Sparq/Shark/SuperDisk/Jaz/Quest ilk–will take care of the need for moving data physically.

Digital versatile disc (DVD) drives are starting to appear as options on some full-size laptops. Apple’s Greg Joswiak says that “the availability of DVD will be important for our newest generation of PowerBook laptops, which are heavily used for graphics presentations and video-intensive applications.”

And You Bake It All About

Heat has been a constant concern of designers of full-function laptops and associated peripherals. The first 5-V Pentiums and older DRAM chips ran at shockingly high temperatures and required fans for cooling. Earlier hard disks were also serious heat producers, and we’ve had more than one PC Card modem that ran hot enough to fry itself. However, the modern versions of all these components run much cooler, and heat-control methods have also improved through the use of conductive fluids, heat pipes, innovative heat sinks, and, yes, fans.

Increases in clock speeds that would otherwise present a serious thermal challenge to laptops have been largely offset by accompanying decreases in physical sizes and operating voltages. The highest-speed mobile Pentium CPUs made with 0.25-micron technology today run at only 1.8 V internally, while memory and I/O run at 3.3 V. Intel’s Mobile Power Guidelines for 1999 target the core at 1.6 V, and memory and I/O at 2.5 V. According to Intel, average laptop power consumption (excluding displays) has doubled (from 10 to 20 W) from 1994 to 1997, and if heat dissipation isn’t addressed, it’s projected to nearly double again by 1999.

With proper management, however, the thermal load can be restricted to 23 to 25 W. For smaller portable devices, lower-powered, non-Intel CPUs, such as Hitachi's SH-3 and Digital Equipment's StrongARM, simplify heat-control issues.

Closely allied with heat is battery power, since excessive heat means wasted electricity. Therefore, heat reduction contributes to longer operating time between recharges, as well as increased battery longevity through reduced thermal stress.

Battery life (i.e., time of operation) has always been a point of contention, with users needing more than portables can deliver, and with manufacturers quoting highly optimistic specs. Today’s lithium-ion cells represent the third generation of laptop battery technology, and at the moment there’s no near-term replacement in sight that offers greater energy density in a compact package. (Zinc-air and starved-electrolyte cells have been shown for laptops, but they’re currently too bulky to build in and are suitable only for add-on battery packs.) For the business sky warrior, airliners with the new generation of computer-friendly power plugs are a welcome development that will take some of the pressure off laptop and battery designers alike.
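As a back-of-the-envelope check on how the power figures above translate into operating time: runtime is roughly battery capacity in watt-hours divided by average draw in watts. The 40 Wh pack capacity below is an assumed, illustrative figure; only the 10 W and 20 W averages come from the Intel numbers quoted earlier.

```python
def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Rough battery life: capacity (Wh) divided by average power draw (W)."""
    return capacity_wh / avg_draw_w

PACK_WH = 40.0  # assumed lithium-ion pack capacity, for illustration only

print(runtime_hours(PACK_WH, 10.0))  # 1994-era average draw -> 4.0 hours
print(runtime_hours(PACK_WH, 20.0))  # 1997-era average draw -> 2.0 hours
```

Doubling the average draw halves the runtime, which is why every watt shaved off the thermal budget shows up directly in time between recharges.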

You Do the Clickey Pokey

With GUI screens and most modern software usable on portables, you absolutely need to have a pointing/navigation device. Yes, you can certainly perform a lot of operations in Windows 95 or NT or the Mac OS using keyboard shortcuts, but you can’t do everything.

So, you need a (choose one): mouse, mouse-on-a-stick, touchpad, trackpoint, joystick, trackball, digitizing pad/pen, or touchscreen. The mouse was there first, and most desktop systems still have one, with trackballs coming in a very distant second place. The other pointing devices were all designed to concentrate fingertip screen navigation into a small, fixed space. They all work–some better than others–and we will likely see even more ingenious systems in the future.

And You Connect Yourself Around

One problem facing portable designers is the number of different connectors they must incorporate into their machines for communications and peripheral hookups. A typical full-size laptop these days can have almost two dozen I/O connectors and switches on the outside of its case. And there still have to be bays for removable drives and batteries, plus upgrade access to the memory and hard drive.

The universal serial bus (USB) standard, which can connect up to 127 devices, is one good candidate for replacing many, though not all, of those varied connectors. Reducing the number of different components would also noticeably lower the manufacturing cost of the system, in both parts and labor.

For all its potential, however, USB has been slow to catch on, and it’s not clear when it will reach the critical mass of acceptance. There’s still only a trickle of USB peripherals available, and most of the things people want to plug into their portables need some other connector.

A pure USB machine is an interesting idea, but it seems to be science fiction. No portable maker we talked to, including Compaq, HP, and Toshiba, seems to be even considering such a machine. Mark Hanson, a product manager for Compaq’s Armada laptops, thinks the great number of legacy peripherals will limit USB’s acceptance. “IEEE-1394 will be more likely to replace some other connectors and will also be implemented in drive bays, although it raises some power-management issues,” he says. For better or worse, USB seems to be not a replacement but just one more connector standard–aggravating, not solving, the problem.

January 2, 2013

Top Reasons For Hard Disk Failure And Then Recovery

Your hard disk drive is a central part of your computer because it stores and shares your data. In the past hard drives were very small, but these days even a personal computer has more than 500 GB of space. Data needs have grown enormously, and with the increase in capacity, hard disk failures have also become more frequent. There are many reasons a hard disk can malfunction, and most of the time software failures are what corrupt the drive. Software failures are relatively easy to cope with because you do not have to deal with the drive's internal parts. You just need a professional who understands software failures; they will not only make your drive good again, but in about 80 percent of cases they will get your lost data back as well. If your drive is not suffering a software failure and the problem lies in its internal hardware, things get more complex. Very few qualified technicians understand hard drive hardware failures, and HD recovery becomes much more difficult in that case.

Your drive needs help when it fails.

The hard disk is the most important component of a computer; without it your computer cannot work at all. The hard disk handles all of your data storage and also holds your operating system. Many things can go wrong with a hard disk, and when it is corrupted your computer will not even start, because the operating system cannot be loaded. Of the many types of failures that can occur, one of the most common is a software problem on the internal media. It is a very common problem, yet very few people actually try to correct it. You should do your best to recover your data from a corrupt hard disk, and in most cases it is possible; you just need a professional who understands the work. These days there are specialist HD recovery firms that provide guaranteed recovery services. If you have critical, important data on your hard drive, hire one of these firms and make sure your data is recovered properly.

Easy Recover

Your drive is responsible for loading the operating system, and if it has crashed your computer will not even start. There is a 60 percent chance you will lose all the data stored on the drive if you do not handle it properly. You need a professional with HD recovery experience, because the process relies on specialized tools; rookie technicians with low-quality recovery tools and software will not be able to help you. There are many legitimate, experienced HD recovery firms working today, and they deal with every problem on its merits. Both software issues and hardware issues can occur in a hard drive, so diagnosing the real problem matters, and professionals use special diagnostic tools to do it. Before hiring a data recovery firm, look at the success rate of its work and make sure it has successful HD recovery cases to show you. The company should not only recover the data from corrupt hard drives but also try to fix your drive so that you can reuse it.

Cost Of HD Recovery

The hard disk is a very important part of a computer, and especially when you use your computer for work, it always holds some important data. In the past people routinely backed up their critical data, but as hard disk reliability has improved, many no longer bother, even though there is still a fair chance of the disk becoming corrupted. It can be a real mess when your hard disk stops working, and there is very little you can do yourself to recover the data. The best option is to hire a professional HD recovery company. These companies have a lot of experience in the field and use customized tools that can not only recover lost data but often restore the drive so you can reuse it. Just make sure the cost of the recovery is not more than the data is worth to you: if your data is truly important, an expensive method can be justified, but if it is not critical, do not spend too much on HD recovery. Get help here.

December 12, 2012

Embedded Chips Stay Strong

The embedded market is a mammoth one for processors. According to a recent article in Forbes magazine, thirty 16- to 32-bit processors are sold into the embedded market for every one that reaches the desktop market, resulting in 1.3 billion units versus 78 million units, respectively.

PCs on a chip do the same job as ZF MicroSystems' chip, and more, but at a significantly higher cost, says ZF MicroSystems. Today, PCs for industrial applications run at about $2,000 to $3,000, with a heat sink and fan. The company says that the NetDisplay PC will cost about half that much, or less.

Why? Because the SCC uses 386 or 486 chips rather than Pentium chips. And though the entire computer consists of some 40 components, including the design board, a licensed BIOS, a DOS operating system and Windows, it is all mounted on a single board, so OEMs can save six to nine months of development, says the company. The OEMmodule provides one-piece, PC-compatible functionality in a credit-card-sized package that is ready to run on power-up.

Most applications the two companies are eyeing, they say, don't require the processing power of a Pentium. Many are simple searches. ZF MicroSystems' computer-on-a-chip adds just enough processing power, for example, to medical instruments like blood analyzers. Furthermore, because designers of such systems are often not also experts at designing PCs, about 65 percent of application ideas are scrapped after they hit the drawing board, says David Feldman, president of ZF MicroSystems. Many don't want to run the risk of designing in-house, but need fast time to market. With its smaller size, the NetDisplay PC can be more easily integrated into larger systems and, if necessary, be hermetically sealed inside. Moreover, says ZF MicroSystems, the SCC eliminates the need for assorted connectors and mounting hardware usually required by single-board computers.

The NetDisplay PC comprises an FPD, which sits on a base containing the SCC, along with a mounting bracket, to integrate into OEM products. The SCC mounts to a circuit board just like a conventional IC, and is complete with integral main memory, BIOS and an OS. The subassembly includes an Ethernet port, for networking multiple units.

The 240-pin OEMmodule SCC, which handles standard motherboard functions, features a fully-integrated PC/AT, 33 or 40MHz 80386SX CPU, DRAM controller, core logic, 8- or 16-bit ISA busing and serial and parallel I/O ports. It also incorporates an ISA-PC/104 bus, floppy and IDE disk controllers, AT-compatible BIOS, an embedded version of DOS, and an internal flash memory for application and data storage.

Among systems that already incorporate the OEMmodule SCC are fare collection boxes in Buenos Aires buses, which perform a variety of other tasks including making change, logging passenger traffic and even monitoring the engine. Other possibilities include name-search systems in office building lobbies and bedside video poker terminals in casino hotel rooms. Any application in an industry seeking information on consumer behavior is a possibility: one credit card swipe collects and tracks information about the buyer.

November 13, 2012

Laptops Will Always Be King Of The Portables

Considering the TCO premium most companies must pay for the care and feeding of notebook computers, you might expect Dan Barth to be looking to trade in his laptops for desktops the way sports memorabilia collectors are looking to unload their Latrell Sprewell trading cards.

But you’d be wrong.

Barth, CIO at Pinnacle Brands, a $200 million Dallas-based maker of trading cards, has shifted to buying laptops exclusively rather than desktop PCs, even though he knows that laptop TCO is higher, sometimes much higher. In fact, Meta Group Inc. estimates that because of the high cost of supporting remote users, higher purchase price and shorter life cycle, typical laptop computer capital and operational expenses are about 50% greater than for a typical desktop.

Laptops are still awesome.

Still, Barth and many other IT managers continue to replace desktop PCs with laptops. Meta estimates that by the year 2000, half of all PC shipments will be laptops, up from 35 percent today. Increased business travel and telecommuting are driving much of that demand.

Given this commitment, IT managers such as Barth are looking for strategies to hold down laptop TCO. “It’s incumbent on us to find ways to manage the cost of ownership of a laptop,” Barth said. “We have to do things to reduce that cost down to where it’s marginal.”

On the road

Some companies, including Pinnacle, are turning to remote systems management tools to cut software upgrade costs and other expenses. Other companies are leasing laptops and looking to outsource laptop hardware distribution and upgrades. In addition, some are counting on standard laptop configurations and VPNs (virtual private networks) to cut down on laptop support and remote access costs.

Pinnacle, for example, has used RemoteWare remote systems management software from Atlanta-based XcelleNet Inc. for about two years. During that time, it has cut the TCO premium on its 50 laptops from about 80 percent more than desktops to about 30 percent more. As a result, the company has been able to increase the number of applications available to its sales force, Barth said.
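The premium figures above can be sketched as simple arithmetic. The percentages (80 percent before, 30 percent after, a 50-laptop fleet) come from the article; the dollar baseline per desktop is a hypothetical value chosen only for illustration.

```python
# Sketch of the laptop TCO-premium arithmetic cited in the article.
# Percentages and fleet size are from the text; the baseline is assumed.
DESKTOP_TCO = 10_000  # hypothetical annual desktop TCO per seat, in dollars

def laptop_tco(premium_pct: float, base: float = DESKTOP_TCO) -> float:
    """Annual laptop TCO carrying the given percentage premium over a desktop."""
    return base * (1 + premium_pct / 100)

before = laptop_tco(80)  # Pinnacle's laptops before remote management
after = laptop_tco(30)   # after two years on RemoteWare
fleet = 50               # Pinnacle's laptop count

savings = (before - after) * fleet
print(f"Per-laptop TCO: ${before:,.0f} -> ${after:,.0f}")
print(f"Fleet-wide annual savings at this baseline: ${savings:,.0f}")
```

At the assumed $10,000 baseline, trimming the premium from 80 to 30 percent is worth $5,000 per laptop per year; the real figure scales with whatever the actual desktop baseline is.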

RemoteWare allows IS officials to distribute software upgrades, manage remote users’ file systems and compress files as they are transmitted, reducing online time, said Cody Cleveland, IS director at Pinnacle, also based in Dallas. RemoteWare also helps Pinnacle deal with laptop users, who tend to install unauthorized applications more than desktop users do. That can drive up support costs.

“You can create a work object in RemoteWare that allows you to do a file status, so if [an application] is not part of the standard configuration any more, we can delete it,” Cleveland said. For example, Pinnacle uses a set of sales force automation databases in Lotus Development Corp.’s Notes. With RemoteWare, as a database becomes outdated it can be deleted remotely.

In the next few weeks, Pinnacle will start rolling out XcelleNet’s Remotely Possible upgrade, which will allow an administrator at corporate headquarters to see a remote user’s configuration once the user has connected to the network.

Although companies such as Pinnacle have used remote management tools to drive down laptop TCO, most large users have not followed suit. While laptops make up a significant share of most companies’ PC bases, and sometimes outnumber desktops, “very few organizations are doing remote systems management,” said David Cearley, an analyst at Meta Group Inc., in Stamford, Conn.

New lease

Other large users are attacking laptop TCO by turning to leasing and outsourcing of remote support services. The New York-based Big Six accounting firm Ernst & Young LLP has been leasing its 27,000 laptops, which account for 80 percent of its PC base, for a little over two years. Leasing has allowed the company to keep up with rapid laptop hardware upgrades and reduced the expense of the machines, according to Rich Mooney, newly named senior manager of Ernst & Young’s Supply Chain consulting practice, in New York. Mooney was most recently Ernst & Young’s national director of purchasing and contracts.

“We were able to better negotiate a leasing program and obtained very competitive pricing and favorable terms and conditions,” Mooney said. Those terms included technology refresh options, on-site hardware support from Dell Computer Corp. and an “interim rent” deal that allows Ernst & Young to take delivery of new laptops up to 15 days before the lease kicks in. This will save the company millions of dollars annually, Mooney said. Altogether, Ernst & Young is leasing between 5,000 and 6,000 laptops this year and plans to replace as many as 18,000 next year.

Leasing has also allowed Ernst & Young to maintain state-of-the-art laptop technology. The company has standardized on Dell’s 166MHz Pentium with MMX Latitudes and is currently moving to 233MHz Pentium with MMX Latitudes for all new leases. “By negotiating directly with the manufacturer, we drive down the price and forecast out a year in advance what we need,” Mooney said. That lowers support costs, since Ernst & Young can standardize on replacement parts. It also reduces the need for different types of IS training and certification.

Also considering a leasing strategy is Value Village Stores, in Bellevue, Wash. However, the company is still trying to get a handle on laptop costs. Overall, in fact, IT personnel at the thrift retailer are spending 30 percent of their time managing just over 75 laptops, with the other 70 percent of their time spent on 250 workstations, said Katherine Flower, Value Village’s IT manager. The company’s base of laptops is expected to grow by about 25 percent, while the company anticipates growing from 150 stores to 200 stores in the United States and Canada by 2000.

The most costly aspects of managing laptops, Flower said, are that most users are remote and more likely to make configuration changes. “Desktop systems are locked down, and users can’t get in and change configurations,” she said. “With laptops … the machines are more open, and it creates a higher-maintenance issue.” In addition, remote users’ machines often need to be sent to headquarters to be fixed. IS receives an average of about 10 troubleshooting calls a week from remote users.

Leasing will help Value Village standardize on laptops, which often go through upgrade cycles in a matter of months. About three years ago, Value Village officials purchased Zeos International Inc. laptops. But within three months, the model wasn’t available anymore. Then, in mid-1995, Value Village switched to NEC Corp. laptops. The company used six different NEC models before switching to IBM laptops last fall.

Flower thinks three-year leases might be the answer. Leasing “pretty much guarantees we can expense out the cost on a monthly basis, then when the technology gets old and the laptops are no longer functional, we can upgrade them all at the same time,” she said.

As another strategy for reducing laptop TCO, many organizations are looking to cut back on costly dial-up communication costs. Value Village, for example, is looking into using a VPN to line up local ISPs (Internet service providers), which then can provide low-cost access to remote users.

Pinnacle also recently began using local ISPs. That strategy is decreasing the cost of long-distance phone calls and providing faster, more reliable connections. Both Cleveland and Barth expect that Pinnacle’s long-distance charges will decrease from $8,000 a month to about $600.

Simply getting laptops to remote employees is another cost that many companies are trying to reduce. Ernst & Young is looking to outsource that headache. Its technical support organization just received approval to find a service provider that will act as a “PC depot” for centralized redistribution and redeployment of all equipment in the United States, Mooney said. “This will allow us to better manage our notebook inventory,” he added. With more than 100 U.S. offices, and a 20 percent industrywide employment turnover rate per year, the company will no longer end up with as many as 120 notebooks scattered around the country when people leave.

Such strategies aren’t likely to bring laptop TCO down to desktop levels. But, say IT managers, they’re essential, since users aren’t likely to be willing to trade in their laptop convenience anytime soon.
