Friday, March 16, 2007

The Virtual Call Centre

The rate at which electronics technology is improving remains amazing. Electronic products are getting smaller and smarter, and old products that never had electronics in them are getting infiltrated by electronics, or completely redefined as electronic appliances. Take the camera, for example. Given the dizzying rate at which old products improve and new ones emerge, the staff of the stores that sell this stuff have no chance of providing comprehensive after-sales service, as stores used to do in the good old days. Increasingly, the manufacturers of these gizmos have to implement call centres in order to provide after-sales service to their customers. Call centres are springing up all over the place. Given the high cost of establishing them, many companies choose to set them up in second and third world countries where costs are lower. Thanks to constantly dropping communications costs, and new paradigms like Voice over IP, it's affordable to support US customers with a call centre in Ireland, India, or South Africa. And with a suitable geographic distribution, call centres can service customers any time that customers want service, at times that are also convenient for the call centre staff.

But building conventional call centres is still an expensive undertaking, even in third world countries. You have to rent a building, or buy land with a building, or build your own. You need heating and air conditioning, power, plumbing, and parking lots. You need lots of furniture, desktop computers, and specialised telephone equipment. You need computer servers to handle and route incoming calls. You have to hire and train staff. Having made a big investment in infrastructure, you need to keep every desk manned for as many hours of the day as possible.

This model isn't all that convenient for call centre staff either. They are tied to their desks for 8 hours a day or longer. They must commute to the call centre before they can start work, and commute home afterwards. It may not be easy for them to pick up the kids after school, or to take the baby for her immunization shots, or the dog to the vet.

The Virtual Call Centre can change all of that. And the technology required to implement it is available right now. We would need some new software to make it work smoothly, but hey, that's a Simple Matter of Programming (SMOP - an acronym popular with computer salesmen).

Imagine we wanted to open a virtual call centre (VCC for short) for business. We would need only a server in a server farm, and another server in a different farm to act as a backup. The server would take incoming customer calls, gather the usual metrics, and then distribute them across the available call centre agents (CCAs for short). Oops! Where are our CCAs? We don't have a building. We advertise on the web for CCAs who can work from home. In order to act as a CCA, a person would need a desktop computer, some sort of telephone equipment, and a broadband connection to the Internet. Ideally, they should have these in their own home, but other arrangements could be made (the cottage call centre - but that's another blog). Their telephony equipment could consist of a headset with attached microphone that plugs into their desktop computer. It would link to the VCC's server through Voice over IP (VoIP). When the agent is ready to take calls, he or she would fire up a program on their desktop that they have downloaded from the VCC's website. This would allow them to log on and indicate that they're ready to take calls. The VCC's server would check what product types they're licensed to handle, and start routing calls to them. Since all calls pass through the VCC's servers, they can still be recorded, and monitored for quality (the monitoring staff could also be working from home). As soon as the call is finished, the VCC's server would know that the CCA is available for another call, and could route one through as needed.
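
To make the routing step concrete, here's a toy sketch of the decision the VCC server has to make: find a logged-on agent who is idle and licensed for the product in question, and hand the call to them. The agent objects, product names, and routeCall function are all invented for illustration; a real implementation would sit on top of an actual VoIP and call-queueing stack.

```javascript
// Toy sketch of the VCC server's routing decision: pick a logged-on,
// idle agent who is licensed for the product the caller is asking about.

function Agent(id, licensedProducts) {
  this.id = id;
  this.licensedProducts = licensedProducts; // e.g. ["camera", "printer"]
  this.onCall = false;
  this.onHold = false;                      // agent clicked the Hold button
}

var agents = [
  new Agent("alice", ["camera", "printer"]),
  new Agent("bob",   ["router"])
];

function routeCall(product) {
  for (var i = 0; i < agents.length; i++) {
    var a = agents[i];
    if (!a.onCall && !a.onHold && a.licensedProducts.indexOf(product) >= 0) {
      a.onCall = true;                      // mark busy until the call ends
      return a;                             // hand the VoIP leg to this agent
    }
  }
  return null;                              // queue the caller, or play hold music
}

console.log(routeCall("camera").id);        // -> "alice"
```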

Any time CCAs need to leave their workstations, they could click a Hold button on the VCC program on their desktop. If they need to go collect kids or run some other errand, they would terminate the VCC program and just walk away, any time they wanted to. The VCC spends nothing on accommodating staff, and can afford to have a larger number of CCAs so that between them they can take off time when they need to, or work when they want to. They would need to be paid per call handled as well as a retainer for just being online. The VCC client program could ask them a question after every minute of inactivity just to make sure that they're still there and ready to handle calls. With a beep, in case they're cruising the web while waiting. In fact the VCC could beep several candidate CCAs, and the first one to accept the call would get it, plus the bounty for handling it.
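
The "beep several candidates and let the quickest one win" idea could look something like this sketch, where setTimeout stands in for the round trip to each agent's desktop program. The names and delays are made up; the point is just the race for the first acceptance.

```javascript
// Beep several candidate CCAs at once and award the call (and its bounty)
// to whoever clicks "Accept" first.

function offerCall(callId, candidates, onAssigned) {
  var assigned = false;
  candidates.forEach(function (agent) {
    beepAgent(agent, function () {          // runs when this agent accepts
      if (assigned) return;                 // somebody else was quicker
      assigned = true;
      onAssigned(callId, agent);
    });
  });
}

// Simulated agent desktops answering after different delays.
function beepAgent(agent, accept) {
  setTimeout(accept, agent.reactionMs);
}

offerCall(42,
  [{ name: "carol", reactionMs: 900 }, { name: "dave", reactionMs: 350 }],
  function (callId, agent) {
    console.log("call " + callId + " and its bounty go to " + agent.name);
  });
```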

Of course to handle calls, CCAs need access to customer and product information. The VCC might handle products for many different manufacturers, and needs to deliver this product information to CCAs on demand. This problem isn't unique to VCCs; it applies to conventional call centres as well. CCAs could access the required product information through a browser connected to the VCC's server, which could in turn get the required information from the manufacturer's servers via SOAP or some such technology.
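
From the CCA's side, the browser doesn't need to know anything about SOAP; it just asks the VCC server for the product's details and lets the server worry about talking to the manufacturer. A minimal browser-side sketch, with an invented URL and response shape:

```javascript
// The CCA's console asks the VCC server for product details; the server
// (not shown) would relay the request to the manufacturer over SOAP or
// some such technology. Endpoint and JSON fields are hypothetical.

function fetchProductInfo(productId, onInfo) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/products/" + encodeURIComponent(productId), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onInfo(JSON.parse(xhr.responseText));   // e.g. { name, faq, manuals }
    }
  };
  xhr.send(null);
}

fetchProductInfo("dsc-w55", function (info) {
  document.title = "Now helping with: " + info.name;
});
```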

The cost savings that could be achieved by VCCs would be huge, but how would the VCCs enlist, train, and evaluate staff?

The VCC could invite potential staff members to enroll on their website by advertising on the Internet or through conventional media. The enrollment software could automatically check the network turnaround time and bandwidth between the enrollee's workstation and the server to see if the enrollee has a suitable broadband connection. If not, the server could ask the enrollee what location they are in, and hand the conversation off to a geographically closer server, if they have one. If the network connection isn't adequate then the enrollee could be informed of this, and invited to try again with better equipment. Even if an enrollee passes the connection test and gets approved as a CCA, each time they log on the client app would have to validate their network connection to make sure that it's adequate. Otherwise a CCA could qualify in a cyber café and then go home to their 14,400 bps dial-up modem.
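
The connection check could be as simple as timing two requests from the enrollee's browser: a tiny one to estimate round-trip time, and a larger one to estimate bandwidth. The test URLs and the pass/fail thresholds below are placeholders, not a real service.

```javascript
// Time a small request for latency, then a larger download for bandwidth.

function timeDownload(url, done) {
  var start = Date.now();
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url + "?nocache=" + start, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      done(Date.now() - start, xhr.responseText.length);
    }
  };
  xhr.send(null);
}

timeDownload("/ping.txt", function (rttMs) {               // a few bytes
  timeDownload("/probe-200kb.txt", function (ms, bytes) {  // ~200 KB payload
    var kbps = (bytes * 8) / ms;                           // bits per ms == kbit/s
    var ok = rttMs < 300 && kbps > 256;                    // arbitrary cut-offs
    console.log(ok ? "connection looks adequate"
                   : "please try again with better equipment");
  });
});
```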

Enrollee skills assessment could be carried out over the Internet, with the enrollee browsing a multiple choice questionnaire. Language and product area skills would have to be assessed. Just to make sure that enrollees don't qualify by paying a friend to help them through this process, staff could be given short snap tests at random times whenever they are logged on, in between calls. Staff training could be handled in a similar fashion, with online web-based training interspersed with frequent online questionnaires. Staff could be shown on request what skills are in short supply so that they could plan their future training accordingly.
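
The random snap tests could be driven from the client app itself, along these lines. The question list and the five-to-sixty-minute window are invented; the only real logic is "wait a random while, and only interrupt an agent who isn't on a call".

```javascript
// Pop a quick multiple-choice question at a random time between calls.

var snapQuestions = [
  { q: "Which menu resets this camera to factory settings?", a: "Setup > Reset" }
];

function scheduleSnapTest(agent) {
  var delayMs = (5 + Math.random() * 55) * 60 * 1000;  // 5 to 60 minutes
  setTimeout(function () {
    if (!agent.onCall) {                               // never interrupt a customer
      var item = snapQuestions[Math.floor(Math.random() * snapQuestions.length)];
      askAgent(agent, item);                           // show it in the client app
    }
    scheduleSnapTest(agent);                           // and schedule the next one
  }, delayMs);
}

function askAgent(agent, item) {
  console.log("Snap test for " + agent.id + ": " + item.q);
}

scheduleSnapTest({ id: "alice", onCall: false });
```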

Quality control could be implemented as it is today in conventional call centres, by recording and optionally monitoring calls. When customers first call the VCC, they could be played a message telling them to hit the hash key if they want to break out of a call at any time - if, for example, the CCA gets abusive, or walks away in the middle of a conversation, or simply doesn't know anything about the product (which may happen if the CCA's kid sister is manning the station). Calling customers could also be invited to hit the hash key at the end of the call to give feedback on the quality of the service that they have received. Hitting the hash key would connect the customer to a voice response unit which would walk them through the various options such as providing feedback on a completed call, or complaining about the quality of service that they are receiving.
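
A sketch of what the hash-key handler might do once the caller has been diverted to the voice response unit. The digits, prompts, and handler names are all hypothetical; a real system would drive an IVR platform rather than console.log.

```javascript
// Walk the caller through the feedback options after they hit the hash key.

function onHashKey(call) {
  playPrompt(call, "Press 1 to rate the call you just finished, " +
                   "2 to report a problem with your current agent.");
  onDigit(call, function (digit) {
    if (digit === "1") collectRating(call);          // 1-to-5 score via keypad
    else if (digit === "2") flagCallForReview(call); // pull the recording
    else onHashKey(call);                            // unrecognised: re-prompt
  });
}

// Console stand-ins so the sketch runs on its own.
function playPrompt(call, text) { console.log("prompt:", text); }
function onDigit(call, handler) { handler("1"); }    // pretend the caller pressed 1
function collectRating(call) { console.log("collecting a rating for call", call.id); }
function flagCallForReview(call) { console.log("flagging call", call.id, "for review"); }

onHashKey({ id: 17 });
```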

While I have used the word "staff" to describe the folk that act as CCAs, they need not be full time employees. They can be part-time folk working when it's convenient for them: mothers with a few hours available in the morning while their children are at school, students with a few hours available in the evenings after lectures, or garrulous senior citizens like me, in between naps. Provided the workforce spans several suitable countries and continents, sufficient numbers will be available at the right times to meet the demands of customers calling in.

The Virtual Call Centre is just a specific example of a much bigger economic trend that I'm sure will start to emerge - the Virtual Office. I wrote about this shortly after the 9/11 incident. If we can get folk to stay at their homes, distributed across wide continents, instead of huddling together in vast high-rise buildings, or buses, or aircraft, we will remove many of the targets that terrorists love to hit. And at the same time liberate people from wasting what adds up to years of their lives commuting.

Saturday, March 03, 2007

Mobilizing Mobiles

Ah, the power of blog! You publish a simple idea, and back comes real gold. My thanks to eric and xiris for their comments, and for pointing out that much of what I "predicted" in my last post about mobiles replacing desktops has already happened! Nokia have already ported the Apache web server to run under the Symbian operating system which many of their phones use, and they have built "cgi scripts" that expose much of the phone's data (messages, address lists, images captured) to any browser that connects to the phone's IP address and (I guess) authenticates itself. So if you leave your phone at home by mistake, you can browse it from the office, check for messages and reply to them, check for missed calls, and so forth. Very clever! Of course this requires each phone to have a distinct IP address, and there aren't enough to go around, but IP Version 6 is gradually getting implemented and that will over time give us more than enough addresses.

Eric points out that much of what I "predicted" happening on mobiles is already available on PDAs such as the Nokia Tablets.

Xiris is concerned about the lack of storage space on mobiles. Storage size has been growing a lot faster than Moore's law, but so has our appetite for using it. Three years from now, mobiles might offer 20Gbytes of storage, and we'll be demanding terabytes. The good news is, we don't need all of our data all of the time – our attention span is simply too limited. Our mobiles' storage can operate as a level-one cache, with the bulk of our files residing on a file server somewhere in cyberspace. We'll be able to search an index of all of our files, and choose the one we want. If the file isn't already on our mobile, the mobile will fetch it for us. And once the mobile's storage starts getting full, it will upload some files that we haven't used for a while and free up the space that they were using.
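
In cache terms, the behaviour described above is just least-recently-used eviction over a remote file store. A toy sketch, with a made-up capacity and a stub standing in for the actual download:

```javascript
// Keep recently used files locally, fetch misses from a server in
// cyberspace, and when local storage fills up, push out whatever has
// gone unused the longest.

var CAPACITY = 5;                 // pretend the phone holds five files
var cache = {};                   // name -> { data, lastUsed }

function open(name) {
  if (!cache[name]) {
    evictIfFull();
    cache[name] = { data: fetchFromServer(name), lastUsed: 0 };
  }
  cache[name].lastUsed = Date.now();
  return cache[name].data;
}

function evictIfFull() {
  var names = Object.keys(cache);
  if (names.length < CAPACITY) return;
  names.sort(function (a, b) { return cache[a].lastUsed - cache[b].lastUsed; });
  delete cache[names[0]];         // least recently used goes back to the server
}

function fetchFromServer(name) {  // stand-in for the real download
  return "contents of " + name;
}

console.log(open("budget-2007.xls"));
```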

Worried about how long the download will take? HSDPA will soon offer better than 4Mbps downloads to our mobiles, and if that isn't fast enough, HSOPA will offer us up to 100Mbps.

Worried about who's going to store all your files and maintain an index across them, and deliver them to you when you need them, and how much it will cost? At the rate they're going, Google will probably do all this for free, unless you beat them off with a stick. Other folk will surely be happy to do it for money.

Xiris is also concerned about the lack of processor power in mobiles. They typically have a 200MHz 32-bit processor today, some have 400MHz, and they're "only" improving at the rate of Moore's law. If you want to do serious spreadsheeting, for example, how responsive will your mobile be? And how much power would it gobble, running the battery down? Xiris also suggests the answer – to embed a utility processor in the docking station, which is a fixture and which can draw line power, and to offload some of the processing workload onto this processor while you're docked. This is a very realistic approach. The dinky screen size of the mobile is going to limit how much serious processing you do with it. If you need to crunch a serious spreadsheet, or update a large document, you will probably seek out a docking station for its big screen and keyboard if nothing else, and get the use of its processor at the same time.

So how can we offload processing onto the docking station? Let's get back to the way in which we deliver the mobile's applications to the docking station. I suggested that we should web-enable them, so that the user can use a browser on the docking station to view and drive the mobile's applications. But think about it – do we really want to code every mobile application twice, once to run on the mobile itself and a second time to run in a browser in a docking station? Life is too short, programmers too scarce (I know, I am one, and I can't get around to programming all the stuff that I want for myself). So let's assume that the mobile's applications are delivered in the same way to either its own screen or to a docking station – as a web app. Most modern mobiles have browsers built into them. Many only support WAP's WML markup rather than full HTML, which leaves out JavaScript, but a fair number of mobile browsers can handle JavaScript, and this number is increasing as the amount of memory available for phone apps grows.

So let's assume that the mobile of the future delivers its applications to a browser in much the same way that Google delivers its Docs and Spreadsheets applications, whether to a docking station or to its own screen. A lot of the processing now takes place in the browser instead of on the web server, courtesy of some clever JavaScript code. And the JavaScript code in Google's spreadsheet web app isn't nearly as smart as it could get. Currently, it sends every cell that you change to the web server via AJAX so that the server can decide how to format the cell's content, and update any formulae that might depend on that cell's value. Most times we change a cell, it's to write in a simple number or character string. It would not be hard to write JavaScript to check whether this is the case, and to format the cell accordingly. Nor would it be hard to check whether any formula includes the changed cell in its scope; if not, there's no need to recalculate the formula. Nor would it be hard to check if the user had previously specified a particular format for the cell that has changed, or a range of cells that includes the changed cell, and to apply the appropriate format. Nor would it be very hard to implement many of the functions that spreadsheets offer, and especially the ones most often used, in JavaScript code, so that any changes to the cells in their scope could be calculated in JavaScript in the browser. Only the way-out and weird cases would have to be sent up to the server, and that would be rare. The browser would handle the regular stuff on its own, offloading the server.
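
A sketch of the kind of short cut described above: when a cell changes, decide locally whether it's a plain number or string, recalculate only the formulas that reference it, and fall back to the server for anything exotic. The cell and formula structures and the sendToServer stub are invented for illustration, not how Google's code actually works.

```javascript
// Handle the common cell edits entirely in the browser; only the hard
// cases go up to the server over AJAX.

var cells = {};                           // "A1" -> { value }
var formulas = {
  B1: { inputs: ["A1"], compute: function (c) { return c.A1.value * 2; } }
};

function onCellEdit(ref, text) {
  if (text.charAt(0) === "=") {
    return sendToServer(ref, text);       // new formulas and exotic input go up
  }
  var num = parseFloat(text);             // crude "is it a number?" test
  cells[ref] = { value: isNaN(num) ? text : num };
  for (var f in formulas) {               // recalc only formulas that use this cell
    if (formulas[f].inputs.indexOf(ref) >= 0) {
      cells[f] = { value: formulas[f].compute(cells) };
    }
  }
}

function sendToServer(ref, text) { console.log("AJAX to server:", ref, text); }

onCellEdit("A1", "21");
console.log(cells.B1.value);              // -> 42, computed entirely in the browser
```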

I have had a look at the HTML page that implements Google spreadsheets. It includes about 380KB of JavaScript code, most of which has had the comments and white space stripped out. When JavaScript is downloaded to a browser, it will be gzip-compressed if both web server and browser support compression, which most do. That would bring the JavaScript download down to about 125KB. On a dedicated 1Mbps link, this would take about a second. That's about as long as it takes to open a spreadsheet package on my current laptop. And almost all browsers cache the files that they download, so if you get into the spreadsheet application several times in a day, your browser would only need to download the JavaScript code once – unless it ran short of cache space and needed to prune its cache.

Likewise when users of http://blogger.com like me :) use its posting page to write up and format a new post, clever JavaScript code in the posting page handles all our text input, editing, and formatting instructions in the browser, only involving the server once we have completed the document preparation. I have just had a look at the posting page. It includes about 23 JavaScript files and has a small amount of JavaScript embedded within it. In total, it uses less than 234KB of JavaScript code to implement what is a pretty handy document composition and editing facility. Most of the JavaScript files still have their comments and white space intact, for readability. There are utilities that can strip out comments and white space and spit out just the code that does the useful work. With gzip compression, the JavaScript download would be less than 60KB. On a dedicated 1Mbps link, this would take half a second. That's a lot quicker than opening your average word processor. And JavaScript file caching would help here too.

In short, we could offload most of the spreadsheet and document processing overheads onto the browser, reducing the load on the web server and the network at the same time. If the web server is running on your mobile and the spreadsheet or document processing is taking place in a browser on a docking station, then the processor in the docking station will be doing most of the work. So you wouldn't need a very fast processor in your mobile, nor would you burn up its battery while you do these tasks.

On the other hand, if you're on the road and only have your mobile to access one of the documents or spreadsheets on it and maybe make a few changes, then the browser and JavaScript will run on your mobile. Your processing rates will not be great, and you will burn your battery while you're busy, but at least it's possible. The constraints of your mobile's screen and keyboard will probably keep this mode of operation to a minimum, but it would be nice to know that when you really need to do it, you can.

The JavaScript language was developed originally by one man, Brendan Eich, rather than a committee. Despite the name, it borrows little more than surface syntax from the Java language. It's a much smaller, simpler language, and is not gaining features and fat as quickly as is Java, which has locked horns with .NET's C# in a sequel to the browser wars movie. Folk who know JavaScript well find it really elegant and powerful. It's not the language that anyone would make their first choice for coding a spreadsheet or document processor, but it's there in almost all browsers. As Woody Allen once said, 80 percent of success is just showing up, and JavaScript is showing up much more than any other in-browser language. Its biggest current drawback is that it's an interpreted language, and runs slower than a compiled one. But then, Java and C# also run on interpreted bytecode, and rely on Just In Time compilers for their speed. Maybe it's time the open source community started working on a Just In Time compiler for JavaScript. Because of the small size of the language, it wouldn't be very hard to do.

By the way, I don't hold any stock in Google, but after researching for this post ... hmm.