The Web has become a major front in this decade’s business wars, and even the most traditional companies must often battle competitors online. E-businesses are investing heavily in the technology required to provide new functionality and services that will attract today’s Internet-savvy customers.
At the same time, companies want to control IT costs and maximize return on investment.
Hence, they are also investing in service level management (SLM), a set of monitoring and management activities that ensures customers consistently have the best possible online experience while minimizing the costs required to sustain that experience. Of course, one element of SLM is internal monitoring of hardware and applications. But more and more companies now also acknowledge the need to monitor Web site performance from an end-user perspective. One way to do this is with owner-operated software designed to access a Web site from beyond the firewall and take measurements that reflect a customer’s experience.
This paper discusses the deficiencies of the owner-operated software approach and argues that the best way to overcome them is neutral, third-party performance monitoring, delivered as a service.
Private Agents continuously monitor and detect problems, giving you uninterrupted information, even when a new application is deployed or when changes are made to an existing Web application.
A Web site that exhibits even subtle error rates internally can never be expected to perform well on the Internet. When you deploy a Private Agent in your corporate data center, the error rates and download times it records will help you quickly detect, diagnose, and resolve problems before wide-scale deployment.
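As a rough illustration of what such an internal probe does, the sketch below requests a set of pages, records download time and failures, and computes an error rate. The endpoints and threshold are hypothetical placeholders, not part of any actual Private Agent product.

```python
import time
import urllib.request
import urllib.error

# Hypothetical internal endpoints a data-center probe might check.
ENDPOINTS = ["http://localhost:8000/", "http://localhost:8000/checkout"]
ERROR_RATE_THRESHOLD = 0.01  # flag anything above a 1% error rate (assumed)

def probe(url, timeout=10):
    """Fetch one URL, returning (ok, download_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # include the full body download in the timing
            ok = resp.status == 200
    except (urllib.error.URLError, OSError):
        ok = False
    return ok, time.monotonic() - start

def error_rate(results):
    """Fraction of failed probes in a list of (ok, seconds) tuples."""
    failures = sum(1 for ok, _ in results if not ok)
    return failures / len(results) if results else 0.0
```

Running `probe` on each endpoint at a regular interval and comparing `error_rate` against the threshold is the essence of catching "subtle" internal errors before they surface to customers.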
A feature-rich website backed by a robust IT infrastructure and an efficient load testing strategy is a great asset for your online business. Customer-friendly features such as Adobe Flash components, video streams, rollover details, and product reviews are often supplied by third-party providers. With these additional features and capabilities, however, the performance of your online business can sometimes suffer.
These modern composite websites, built with decision-making features, are designed to drive customer interaction. If the site performs poorly or a page fails to load, your customers will not blame the behind-the-scenes third-party providers. You will get the blame, and your customers will soon rush to your competitors.
Preventive measures always leave room for customer retention. Quality control, grounded in understanding and managing your composite website, must be the main focus. A detailed Web performance analysis, with transaction monitoring and measurement of your site including all its added components, is a good start. A specialized Web tool to monitor your site’s performance is critical, and the analysis should reflect a real browser, that is, how your customers actually experience your site. With this analysis you can obtain critical information to improve your site and your online performance, and the resulting data will provide valuable insights for customer retention.
It has been some years now since the dot-com bubble burst, and substantial companies with coherent business plans have gradually replaced the wacky and wishful to create a thriving online business environment. This was inevitable given the basic promise of the Internet, the advent of broadband, and the remarkable innovations in information technology. Executives of many “click-and-mortar” companies now recognize that Web-based sales and services are almost as essential to their success as they are to the success of businesses that are conducted entirely online.
Surveys show that IT spending is on the rise, especially for Web applications, because enhanced online service can have a direct and significant impact on customer satisfaction. For example, CIO magazine reports that improved online service is a key IT objective for 72 percent of the companies it recently surveyed, all from the Fortune 1000.[1] In the same survey, 49 percent of the respondents support enhancing or creating new IT programs to generate more e-business revenue. Meanwhile, chief information officers (CIOs) want to keep tight controls on IT budgets. Not surprisingly, the CIO magazine survey indicates that 44 percent of the respondents plan to reduce IT operating costs while 38 percent want to improve employee productivity. The days of irrational exuberance are over, and prudent executives are left with conflicting impulses to spend and save.
How can an e-business use IT to enhance Web performance and increase revenue while also controlling costs? One obvious strategy is to focus internally on the systems that deliver content and functionality. To this end, most organizations originally invested in tools to monitor the hardware components of their IT infrastructure. These tools worked fine in a contained mainframe environment, where functioning hardware was a reliable indication of customer experience. However, as companies moved to distributed servers running Web applications, it became important to also monitor application software characteristics, e.g., the length of time to run a database query or load a Java servlet. The idea is to use these so-called internal systems management tools to detect problems quickly from within the firewall and respond efficiently, thereby diminishing the external impact on service quality and customer satisfaction.

While efficient systems management is important, many IT professionals have recently recognized that it does not always capture customer issues. There are a variety of reasons for this, but the basic deficiency is that the state of customer service is inferred indirectly from measurements of internal components and applications. The systems management tools say nothing directly about the end-user experience, the ultimate determinant of customer satisfaction. In response, many organizations have purchased software to supplement systems management with performance monitoring,
which is conducted outside the firewall from the customer’s perspective. Essentially, the software runs on computer proxies placed in different geographic locations. A proxy—also called an agent— initiates scripted sessions with an e-business site and emulates a real customer by making requests, conducting transactions, and so forth. An agent takes quantitative measurements of the Web site’s response to its actions, and these measurements are intended to reflect the customer experience. The major difference between systems management and performance monitoring is that the former is conducted inside the company’s firewall and the latter is conducted from outside the firewall.
However, as described, the methods share a basic feature: both are performed by IT staff with owner-operated monitoring software. In other words, an e-business does all the monitoring by itself.
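To make the agent mechanism concrete, the sketch below models a scripted session as an ordered list of steps an agent plays back, timing each one the way a customer would experience it. The step names and URLs are invented for illustration; a real monitoring agent would also capture far richer metrics.

```python
import time
import urllib.request
import urllib.error

# A scripted session: the ordered steps a real customer would perform.
# These URLs are placeholders standing in for an e-business site's pages.
SESSION_SCRIPT = [
    ("home",     "https://example.com/"),
    ("search",   "https://example.com/search?q=widgets"),
    ("checkout", "https://example.com/checkout"),
]

def run_session(script, timeout=15):
    """Play back the script in order, timing each step as an agent would.

    Returns a list of (step_name, ok, seconds). Stops at the first
    failed step, since a real customer could not continue past it either.
    """
    measurements = []
    for name, url in script:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                resp.read()  # time the full response download
                ok = 200 <= resp.status < 300
        except (urllib.error.URLError, OSError):
            ok = False
        measurements.append((name, ok, time.monotonic() - start))
        if not ok:
            break
    return measurements
```

Running `run_session` from agents in several geographic locations, and aggregating the per-step timings and failures, yields exactly the kind of outside-the-firewall, customer-perspective measurements the text describes.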
Although e-businesses need to supplement the internal perspective of systems management, website performance monitoring can’t do the job effectively if it is conducted with owner-operated tools. Given the realities of the Internet and the pressures of a competitive marketplace, website performance must be monitored by a neutral third party with global resources and a comprehensive, well-tested, credible methodology. This is precisely the service Keynote so effectively provides to help e-businesses improve key Web application service levels, maximize revenue, and effectively manage costs.