This is the second part in the ‘Website Monitoring 101’ series, in which we move beyond the basics of website monitoring and start looking at how to put it to use.
A website that is frequently inaccessible is likely to destroy customer loyalty and lose business. Ensuring that every element of a website is functioning properly is critical to maximizing your company’s web investment. A good vendor offers several advantages:
- 24×7 monitoring of all key areas of your website and web applications
- Quick and accurate notification when a problem occurs
- Web-based real-time reporting of historical data
- Easy setup and immediate results, with no software or hardware to maintain
- Multiple Internet location monitoring for a holistic view of end-to-end connectivity for geographically distributed users
- An accurate view from the end-user perspective
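To make the first two capabilities on that list concrete, here is a minimal sketch of a single availability check with an alerting rule. This is an illustrative example, not any vendor’s actual implementation; the URL, field names, and four-second threshold are assumptions for the sketch.

```python
import time
import urllib.request
import urllib.error

def check_site(url, timeout=10):
    """Fetch a URL once and report status code, success flag, and elapsed seconds."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
            ok = 200 <= status < 400
    except urllib.error.URLError:
        status, ok = None, False
    return {"url": url, "ok": ok, "status": status, "seconds": time.monotonic() - start}

def alert_if_down(result, threshold_seconds=4.0):
    """Return an alert message when a check failed or exceeded the threshold, else None."""
    if not result["ok"]:
        return f"ALERT: {result['url']} is unreachable"
    if result["seconds"] > threshold_seconds:
        return f"WARN: {result['url']} took {result['seconds']:.1f}s"
    return None
```

A real service would run checks like this on a schedule from multiple geographic locations and feed the results into reporting, per the list above.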
And in case you’re wondering which features allow a website monitoring service to stand out as a big player in the industry, the next section explains them, so that you have a better understanding of what it takes to work with website monitoring services profitably.
In this part, we move beyond those widely known advantages to the less obvious aspects of successful website monitoring.
1. Load Testing
A study from comScore revealed that four out of five of the US’s roughly 86 million smartphone users accessed retail content on their smartphone during July 2012.
Many smartphone users check pricing and deals on their phones even while visiting a physical store. If your site is fast and responsive, you could win new customers or solidify relationships with current users. While mobile shoppers love apps, mobile web enables them to search new sites quickly and efficiently during sales.
Assuming you have a mobile-friendly site, here are few questions to ask to ensure a smooth holiday shopping experience:
1. What is weighing down your site?
2. Are you taking advantage of holiday opportunities?
3. Do you load test and monitor the site regularly?
4. And the final question: is your mobile site ready for the holidays?
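Question 3 above, regular load testing, can be sketched in miniature: drive many concurrent simulated users against a request function and summarize the response times. This is a toy harness under stated assumptions (a caller-supplied `make_request` callable stands in for a real browser session), not a substitute for a full load testing product.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def load_test(make_request, concurrent_users=10, requests_per_user=5):
    """Drive make_request() from many threads and summarize response times."""
    def user_session(_):
        timings = []
        for _ in range(requests_per_user):
            start = time.monotonic()
            make_request()  # one simulated page request
            timings.append(time.monotonic() - start)
        return timings

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        all_timings = [t for timings in pool.map(user_session, range(concurrent_users))
                       for t in timings]
    all_timings.sort()
    return {
        "requests": len(all_timings),
        "mean_s": statistics.mean(all_timings),
        # Approximate 95th percentile by index into the sorted timings.
        "p95_s": all_timings[int(0.95 * (len(all_timings) - 1))],
    }
```

In practice you would watch how `mean_s` and `p95_s` degrade as `concurrent_users` rises toward expected holiday peaks.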
As we’ve mentioned here before, adopting cloud computing strategies can generate transformative advantages for IT organizations, but not without important considerations. Reducing cost and improving user experience can be achieved by moving applications and infrastructure to the cloud.
So how do CIOs get started and, more importantly, enforce and improve the quality of service they deliver to the business in a cloud paradigm? Vik Chaudhary recently spoke with the editors at CIO Insight about how companies can take advantage of the cloud with three straightforward recommendations.
We’ve heard a lot recently about the importance of speed and performance when it comes to online retail. The New York Times highlighted research from Microsoft claiming that 250 milliseconds—a mere eye blink—could make the difference between a repeat visitor and a lost customer. And a popular infographic touts that Amazon would stand to lose $1.6 billion in sales per year from a 1 second web page delay. Our friends at Walmart.com have also shared some awesome research linking web performance to conversion.
These statistics are welcome news for the web performance community. But sometimes they don’t apply. With Apple, a lot of rules don’t apply.
This past weekend, Apple sold a record 3 million new iPad 3 tablets. That’s pretty phenomenal. Yet, it came on the back of a pretty bad outage only 10 days before.
On March 7, Apple announced the new iPad 3. For effectively the entire day, the Apple Store was unavailable. That meant no one could check out the new iPad, nor purchase iPhones, MacBooks or anything else.
To Apple’s credit, the Apple Store normally runs very quickly—averaging well under 2 seconds for total User Experience Time and less than a second for Time to First Paint. (The Apple Store is a member of the Keynote Retail Performance Index, measured with Keynote Transaction Perspective.)
We’ve written previously about the concept of tenacity. A website visitor’s tolerance for errors or delays is a major factor when balancing the cost and benefit of building capacity and engineering performance into Web applications. While Apple’s fanatical customer base is an extreme, it illustrates the point that there’s a continuum of performance expectations among users.
Your product/service is unique. And your customers are also unique. Keynote web load testing consultants dig into web analytics to model user behavior. They consider familiarity, tenacity, interaction speed and connection speed when developing virtual user profiles. It may be unrealistic for you to understand how different levels of performance impact your various customer types across all these variables. But if you can begin to understand them, you’ll be in a better position for setting ongoing performance goals and SLAs, especially around tolerances for outliers from your averages.
Unfortunately, this is where many site owners stop. The site goes into production, and from behind the firewall, everything appears to be snappy. There’s no reason users should be anything short of delighted. But unless the site is monitored out on the Web with all third-party content being fed, using a real browser just as a user would, there’s no way to tell that everything is working and that pages are loading in an acceptable timeframe.
Business people love data, and as indicated earlier, the use of tracking tags on Web pages has simply exploded. Not surprisingly, more tags equals more performance management challenges, particularly since multiple vendors are usually involved, presenting multiple opportunities for glitches.
Multi-sourced content is here to stay — businesses need it, and users want the end result. Content can be tamed and made to perform well by consistently, continuously following these best practices:
- Be sure your site is architected properly so that third-party content will have minimal impact on page load times — four seconds is the magic number, beyond which users abandon your site in droves.
- Scrutinize every third-party component to be sure it’s absolutely necessary; pare down the elements to only those needed to satisfy your site’s business and revenue objectives.
- Monitor the performance of each page component continuously, from the field with real browsers just as users would experience your site; when web performance issues come up, invoke your SLAs, negotiate a fix with the vendor, or lose the problem component.
- Practice good site hygiene — clean up unused tags on a regular basis.
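The “scrutinize every component” and “clean up unused tags” practices above both start with knowing what third-party content a page actually carries. Here is a rough inventory sketch that counts externally hosted script, image, and iframe sources by domain; the regex-based parsing and the domain-suffix test are simplifying assumptions, not production-grade HTML handling.

```python
import re
from collections import Counter
from urllib.parse import urlparse

def third_party_inventory(html, first_party_domain):
    """Count externally hosted script/img/iframe sources by domain.

    Note: a naive suffix check stands in for real same-origin logic here.
    """
    srcs = re.findall(r'<(?:script|img|iframe)[^>]+src=["\']([^"\']+)["\']',
                      html, flags=re.IGNORECASE)
    domains = Counter()
    for src in srcs:
        host = urlparse(src).netloc
        if host and not host.endswith(first_party_domain):
            domains[host] += 1
    return domains
```

Running a pass like this regularly makes it obvious when a vendor tag lingers after a campaign ends and should be pruned.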
Operations teams have long used this information to tune their websites and correct web performance issues. But collaborating with developers on problems impacting user experience was more difficult.
Now operations teams can monitor, measure, and parse website page performance in a way that offers a far more telling picture of user experience: information that’s both actionable for developers and of concern to business owners.
To give business owners this insight, Keynote in Transaction Perspective 11 leverages Navigation Timing to measure distinct phases of User Experience:
Time to First Paint: When the user sees something happening on the screen; the site has begun to render in response to their request. This critical first step tells the user that the site is responding to their action.
Time to Full Screen: When most users would perceive their browser space is filled above the fold; rendering may still be happening out of sight, but from the user’s perspective, they’re looking at a full page.
User Experience Time: The total elapsed time the page took to complete. The browser is done with the page and is now waiting for and responding to additional user input. This is analogous to the standard page load time or user time; it can also be used to measure a complete multi-page transaction.
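The three phases above are all offsets from the start of navigation. As a hedged illustration, the arithmetic can be shown with a small helper; the event names used here (`navigation_start`, `first_paint`, `full_screen`, `load_end`) are illustrative stand-ins, not the official Navigation Timing attribute names or Keynote’s internal schema.

```python
def experience_phases(timestamps):
    """Derive the three user-experience phases (in ms) from raw event timestamps.

    `timestamps` maps event names to epoch milliseconds; the keys are
    illustrative assumptions for this sketch.
    """
    start = timestamps["navigation_start"]
    return {
        "time_to_first_paint_ms": timestamps["first_paint"] - start,
        "time_to_full_screen_ms": timestamps["full_screen"] - start,
        "user_experience_time_ms": timestamps["load_end"] - start,
    }
```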
Site owners are more pressured than ever to deliver the fast, flawless experiences users now demand, and can often find at a competitor’s site. Monitoring and measuring web performance is no longer the simple task of measuring overall page load time. There’s really nothing a webmaster can do with the information that the site is running slow. Is it their own content? The CDN that’s pushing out their videos? The sister site that’s hosting their image library? The Flash banner promoting upcoming programming on their TV network? Or the ad network servers that supply the bulk of the site’s revenue? Web load testing can help, but how does the site owner identify the bottlenecks and gain actionable data to demand better performance from weak providers in the content chain?
New technology has made it almost as easy to shoot, edit and post a video online as to prepare a written story with accompanying photos. Online media sites, with help from YouTube, have enabled a mass Web audience that prefers to watch rather than read.
There’s also no faster way to lose an audience than with a video stream that stutters and constantly stops to rebuffer. But again, monitoring streams from multiple servers or domains, and understanding actual end-user performance, is a significant test and measurement challenge.
In addition to evaluating customer experience on a subjective level, the Keynote research assessed seven factors related to the site’s service levels:
- High-Speed Response
- Dial-up Response
- Response Time Consistency
- Geographic Uniformity
- Load Handling
- Outage Hours
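Two of the factors above, Response Time Consistency and Geographic Uniformity, are simple statistics over samples collected from multiple monitoring locations. A minimal sketch, assuming response-time samples in seconds keyed by location (the metric names and formulas are this sketch’s own, not Keynote’s published methodology):

```python
import statistics

def service_level_summary(samples_by_location):
    """Summarize response-time samples (seconds) from several monitoring locations."""
    means = {loc: statistics.mean(times) for loc, times in samples_by_location.items()}
    all_samples = [t for times in samples_by_location.values() for t in times]
    return {
        "mean_by_location_s": means,
        "overall_mean_s": statistics.mean(all_samples),
        # Lower standard deviation = more consistent response times.
        "consistency_stdev_s": statistics.stdev(all_samples),
        # Spread between the fastest and slowest location averages.
        "geographic_spread_s": max(means.values()) - min(means.values()),
    }
```

A large `geographic_spread_s` relative to the overall mean is the kind of signal that would flag poor geographic uniformity for distant users.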
Three Bottlenecks That Block Traffic
Too many technical elements on a page: From small non-visual images to JavaScript files to unnecessary encryption, too many individual elements on a page can stifle performance.
JavaScript overload: The ubiquitous scripting language is an important tool for developers, but every JavaScript file can act like a tiny speed bump for browsers.
Proliferation of third-party tags: The rising number of third-party tags – DoubleClick ads and calls to third-party analytics services – can also hinder performance.
Mobile internet is another high-potential area for this market. Although the ability to book a car wirelessly has been around since about 2000, it’s still very much gaining traction. Mobile, of course, presents a whole new set of challenges for rental companies in terms of maintaining a positive user experience. Operating on a tiny screen will demand even more technical efficiency and optimization. But the opportunity is a great one, particularly for capturing busy business travelers on the go.
One change is that users are much less tenacious, much less tolerant of poor performance. Five or six years ago there was still a sense of novelty. Today, though, people use the Internet for very critical things, be that trading stocks, looking at bank accounts, or making purchases.
Delivering complex functionality in a manner that satisfies high user expectations requires a tremendous infrastructure, which in itself exponentially multiplies the opportunity for slow performance or outright failures. To deliver a customized “my” page on a site such as Yahoo! or Google or one of the news portals, for example, may require hundreds of servers. And running a search-and-transaction site such as eBay takes a huge amount of processing horsepower.
Whether the objective is to reduce abandonment rates, to increase self service and reduce call center loads (and costs), to increase average sales or repeat purchases, performance monitoring is critical to acquiring the data needed to formulate sound Web strategies and tactics. Web performance is the common denominator underneath every Web site metric and is fundamental to achieving any Web site goal.
Things can and do go wrong at any step of the way — in the site’s own internal network, over the Internet backbone, across the last mile of the local ISP, or on the user’s desktop. Site operators employ a number of strategies to monitor this complex path and pinpoint the many problems that inevitably come up.
The leveraging of personal data, both for marketing and personalization, is arguably a major factor in the evolution of the online experience as we know it today. But the exploding scale of its collection and use – and increasing consumer awareness of the practice – has launched a debate about how best to collect and use data to enhance the Web experience with performance while protecting the privacy rights of consumers.
Scores of companies are in the online data business. Some collect data from individuals’ Web browsers and offline sources like auto registration and real estate records. Others aggregate and analyze the data – sometimes down to semantic analysis of what a user writes in comments or social media updates. Still others package it and offer it for auction on online exchanges to still other companies that place online ads, or to websites themselves, who use it to tailor content, offers, and even pricing based on the profile of the person sitting on the other side of the browser.
The moment a consumer puts an item in a shopping cart, makes a bid on an auction site, or takes any number of innocuous actions, that information is put up for sale — virtually instantly — often for just fractions of a penny. (It adds up quickly, though; targeted advertising commands a 100+ percent premium over non-targeted ads.)