Word of the Day – Page Parking and Parallel Browsing (Web Browser Tab Usage)

From Jakob Nielsen:

Summary: Browser tabs separate the stages of collecting and comparing and serve as memory aids that keep many alternative pages available for consideration while users are shopping or researching. Follow 7 UX guidelines to better support this user behavior, which is particularly common among younger users.

How do people use the tabs in modern browsers? The ability to keep multiple pages open at the same time in different tabs can be used for parallel browsing, where a user alternates between tasks and resurfaces a tab when it’s time to work on the task in that tab. For example, a user might keep Facebook open all day in a tab that’s checked for updates from time to time.

Our recent user studies for the course Designing for Millennials found that young adult users engage in another tab-related behavior, which we call page parking: opening multiple pages in rapid-fire succession as a way to save the items on those pages and revisit them at a later stage. This behavior often occurs when shopping, researching, or reading news, but can happen in any task where it’s useful to open several similar items, each in a separate tab. Later, after users review the content in the tabs, they may cross off many of the parked items and close the corresponding tabs.

Previous WOTD: Jenny Haniver

Printer. My old foe. We meet again.

“Printer not found on network” is the new PC LOAD LETTER ERROR.

Are you old enough to remember Simon, the electronic memory game?

Are you old enough to remember Simon, the electronic memory game? It turns out it’s older than you and me.


Future Nostalgia for Today’s Middle School Kids

I needed to print something

I needed to print something. I fought the printer for 10 minutes, doffed my hat to the victorious output device, then quit the field of battle.

iThemes Security WordPress plugin has a great new feature

WordPress installations often get broken into by brute-force guessing of the password for the “admin” account. Changing the administrator account name to something other than “admin” is the single best thing you can do to improve WordPress security.

My favorite WordPress security plugin is iThemes Security (formerly Better WP Security). It has every security feature you can think of in one plugin, and it’s available as a free plugin or a paid version with more features. I just noticed that it has a new feature: it can automatically blacklist IP addresses that try to log in using the “admin” username.

  1. Install the iThemes Security plugin. In the WordPress administration panel, click on Security. It will be on the left side near the bottom.
  2. Before making changes, make a backup of your database on the off chance something goes wrong. Click the Backup tab, then click the Create Database Backup button. While you’re in that tab, it’s a good idea to schedule automatic database backups.
  3. Click the Advanced tab. Change the administrator name to something other than admin.
  4. Click the Settings tab. Under Brute Force Protection, check the box for “Immediately ban a host that attempts to login using the ‘admin’ username.”
  5. Click the Save All Changes button.

That will stop 99% of bogus login attempts.
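
If you’re curious what’s going on under the hood, here’s a rough sketch of the idea as a tiny must-use plugin. This isn’t iThemes Security’s actual code, and unlike the plugin it only rejects the login attempt rather than banning the host’s IP address, but it shows how WordPress’s authenticate filter can shut down “admin” logins before the password is even checked:

<?php
// Sketch only -- not iThemes Security's code. Rejects any attempt to log in
// as "admin" before the password check runs. Unlike the plugin, it does not
// ban the host's IP address.
add_filter( 'authenticate', function ( $user, $username, $password ) {
    if ( strtolower( (string) $username ) === 'admin' ) {
        // Returning a WP_Error aborts the login attempt.
        return new WP_Error( 'admin_login_blocked', 'Logins with the "admin" username are not allowed.' );
    }
    return $user;
}, 30, 3 );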


Another WordPress plugin I like is Captcha (free and paid versions available). It protects the login page and comments from bots by asking the user to answer a simple math problem.
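
The plugin handles all of this for you, but the underlying idea is simple enough to sketch in a few lines of PHP. This is just an illustration of the concept, not the Captcha plugin’s code:

<?php
// Concept sketch of a math captcha -- not the Captcha plugin's code.
session_start();

if ( $_SERVER['REQUEST_METHOD'] === 'POST' ) {
    // Compare the visitor's answer to the one stored when the form was shown.
    if ( (int) ( $_POST['captcha'] ?? -1 ) !== ( $_SESSION['captcha_answer'] ?? null ) ) {
        die( 'Wrong answer. Are you a bot?' );
    }
    // ...process the comment or login here...
}

// Generate a new question and remember the answer for the next submission.
$a = rand( 1, 9 );
$b = rand( 1, 9 );
$_SESSION['captcha_answer'] = $a + $b;
echo "What is $a + $b? <input name=\"captcha\">";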

Court: Not infringement to use competitor’s trademark in Google AdWords

Harvard Journal of Law and Technology, Using a Competitor’s Trademark as a Keyword for AdWords is Not Trademark Infringement:

The United States Court of Appeals for the Tenth Circuit held that the use of a competitor’s trademark as a keyword that activates sponsored links in Google’s search engine is not trademark infringement. 1-800 Contacts, Inc. v. Lens.com, Inc., No. 11-4114, -4204, -4022 (10th Cir. July 16, 2013). The court affirmed the lower court’s summary judgment to defendant Lens.com with respect to 1-800 Contacts’ claim that Lens.com was directly liable for misdirecting customers to click on links to Lens.com after searching for the phrase “1-800 Contacts.” Id. at 4.

That’s what a court found, which isn’t to say that Google has to have the same rules. After this 2013 court ruling, Google changed their policy: they now allow trademarked terms as keywords. They still block at least some trademarked terms from being used in the text of the ads, depending on the usage, and trademark holders can file a complaint.

My Million Dollar iPhone Idea

I figured out a way to make a million dollars. I’m going to make a smartphone that’s exactly like the iPhone in every way, except it vibrates loudly enough that you don’t miss half of your calls.

Did someone just try to hijack the lesjones.com domain?

I got this email from GoDaddy saying that the transfer of lesjones.com to GoDaddy failed. I didn’t try to transfer my domain. So what happened – did someone try to hijack the domain?

Run PHP code in a WordPress widget

I had an advertiser who wanted his link to appear in the right sidebar of the site, but only on the home page. Sidebar widgets usually appear throughout the site, so I had to figure out how to make it work.

I knew WordPress supported an is_home() conditional statement, but PHP code can only be executed in themes and plugins, not in free text/HTML. I tried inserting the code into the theme’s functions.php file, but never got it to work exactly right.

It turns out there’s a WordPress plugin called PHP Code Widget that lets you execute PHP code inside a widget. Just type it in along with your text and HTML and it works. Here’s the code:

<?php if ( is_home() ) : // only output on the home page ?>
Text and HTML go here and will appear only on the home page.
<?php endif; ?>

Cloudflare Update Week 2

Last week saw a 17% increase in page speed compared to the control period before using Cloudflare. I had a feeling I wasn’t seeing the full speed boost. Because lesjones.com has so many pages and images, it seemed likely that Cloudflare hadn’t seen them yet, so it hadn’t cached them.

To solve that problem I ran a linkchecker that accessed every linked page and embedded image on the site. That seems to have helped. Compared to the control period, performance has improved 34%. For my site that’s an average load time that’s about 2.3 seconds faster. Not bad at all considering it took almost no effort.


Cloudflare Results in Google Analytics

One thing that concerns me is that the page load sample for last week was extremely small – just seven pages compared to 24 for the control period. For various reasons Google Analytics can’t* and doesn’t** collect page timings for all pages.


Both of those page load samples are very small as a percentage of traffic – just 0.45% for last week. At work the number is about 16%. I’m really not sure why the sample here is so small, but I’m going to continue the test for another week to be sure the results aren’t a fluke.

* From Google Analytics Help: “Site speed tracking occurs only for visits from those browsers that support the HTML5 Navigation Timing interface or have the Google Toolbar installed. Typically this includes: Chrome, Firefox 7 and above, Internet Explorer 9 and above, Android 4.0 browser and above, as well as earlier versions of Internet Explorer with the Google Toolbar installed.”

** That same help section says that “By default, a fixed 1% sampling of your site visitors make up the data pool from which the page timing metrics are derived.” However, that doesn’t match the numbers on this site or the one I manage at work.

Cloudflare One Week Update

Last week I routed the lesjones.com domain through Cloudflare. Their service acts as a content delivery network (CDN). They have locations around the world and can deliver files faster than my single location can.

When a browser requests a file, Cloudflare fetches the file from my site and caches it. The next time the file is requested, it comes from Cloudflare instead of my site. Their servers and Internet connectivity are faster than mine, so visitors get the file faster. The caching also reduces the number of requests my website has to handle, which should increase the speed of the website for other tasks.


One week later, total page transfer time (the page plus the images) across the entire site is a little faster, about 17%. Roughly the entire one-second speedup in load time appears to come from a faster average server response time.

The question is, is the change in page load time a fluke unrelated to the CDN/Cloudflare change? Or did relieving my website of serving all of those files free it up to serve out pages faster? It’s hard to say. I’m going to keep running Cloudflare for a few weeks to get a better idea.

Cache misses

I’m going to try something to see if I can improve the performance. I’ve got a big site with a lot of posts and images, and many of them may not have been requested yet, so Cloudflare probably has just a fraction of the images cached. When those files are requested, Cloudflare doesn’t have them (what’s called a cache miss), meaning it still has to fetch them from my website the first time.

To get all of the files into the cache, I’m running a link check this morning. The linkchecker will access all of the links and embedded images on the site, which will prime the cache.
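
If you wanted to do the same thing with a few lines of PHP instead of a linkchecker, something like this would work. It’s only a sketch: it assumes the site publishes a standard sitemap.xml, the sitemap URL is illustrative, and unlike a real linkchecker it doesn’t follow embedded images.

<?php
// Cache-priming sketch -- not the linkchecker I actually used. Assumes a
// standard sitemap.xml; the sitemap URL is illustrative.
$sitemap = file_get_contents( 'http://lesjones.com/sitemap.xml' );

// Pull every <loc> URL out of the sitemap.
preg_match_all( '#<loc>(.+?)</loc>#', $sitemap, $matches );

foreach ( $matches[1] as $url ) {
    // Requesting each page once forces Cloudflare to fetch it from the origin
    // and cache it, so the next visitor gets a cache hit.
    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true );
    curl_exec( $ch );
    echo $url . ' => HTTP ' . curl_getinfo( $ch, CURLINFO_HTTP_CODE ) . "\n";
    curl_close( $ch );
}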

Turning on Cloudflare

Speed is good – there are plenty of studies linking server speed and sales conversions on e-commerce sites. I’ve set a goal of getting our average page load time below three seconds as measured by Google Analytics.

One way to speed up a website is to use a Content Delivery Network (CDN). CDNs use a number of techniques to speed up content delivery.

CDNs have locations around the world, hence the network part of the name. When a file is requested the CDN determines the quickest path to get the file to the user, based on their location and network path. CDNs also tend to be lightning fast in terms of DNS response, redirect time, server response time, and Internet connection speed.

I’ve been making baby steps toward using a CDN at the e-commerce website I manage. With just some small changes the speed difference has been tremendous. The CDN is delivering static files like images, JavaScript, and CSS two, three, even four times faster than we can. Besides delivering files faster, the CDN is offloading those requests from our server and Internet connection to theirs. That should make the server respond better to the requests it’s still handling, such as dynamic pages.

Right now we’re just using the CDN to serve out header and footer template images – things like the logo, navigation buttons, and the CSS and JavaScript files that are common to most pages. To really take advantage of the CDN we need to use it to serve product images, of which we have many thousands and which account for the majority of our downloads. The challenge is that as new products are created and new product images are uploaded, we’d need some way to make sure the new images are synced with the CDN.

One way to sync them is to select a CDN that acts as a caching reverse proxy. We upload the image, and our e-commerce system puts it in the right directory on our server, such as /images/product/image.jpg. We change our templates to call the images from the CDN instead of our server using the same directory path, such as http://www.cdn.com/images/product/image.jpg. The first time the file is requested from the CDN, it will realize it doesn’t have it and fetch the file from our website at the same path. From then on the CDN caches the file and serves it from its own servers.
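
In practice the template change can be as small as a helper function like this. The function name is a placeholder, and the hostname is the made-up cdn.com example from above, not real code from our system:

<?php
// Hypothetical template helper -- the function name and CDN hostname are
// placeholders. Because the CDN pulls from our origin at the same path, the
// only thing that changes in the template is the hostname.
function cdn_image_url( $path ) {
    return 'http://www.cdn.com' . $path;
}

// Usage in a template:
// echo '<img src="' . cdn_image_url( '/images/product/image.jpg' ) . '">';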

Another way to move all of the images to the CDN is CloudFlare. Instead of changing URLs and uploading files, you just change your domain’s DNS records to point at CloudFlare. All requests for anything on your site go through CloudFlare. They cache the content that comes from your site and then serve it out over their CDN the next time it’s requested. All of your URLs stay exactly the same. The basic plan is free. Paid plans add some interesting features, like DDoS protection, additional speed enhancements, and mobile optimization.


Something Evil About Robots.txt I Didn’t Know

Quick background: A robots.txt file on your website will tell search engines and other bots that obey the robot exclusion standard what files and folders they can and can’t index, or whether they can access the website at all.

I’ve been working on the robots.txt file at work the last few days.* Once the file listed all of the bots I wanted to exclude, I decided to run it through a robots.txt validator.

Boy, did I learn a few things. It turns out that you should put the robot (user-agent) exclusions at the top and the directory and file exclusions below. There were also a few minor formatting issues that I’m not sure really mattered.
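
As a rough illustration of that ordering (the bot name and the second directory are placeholders, not lines from our actual file):

User-agent: SomeBadBot
Disallow: /

User-agent: *
Disallow: /video/
Disallow: /private/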

There was one, however, that was a shock. Let’s say you’ve got a folder called “video”. There’s a huge difference between these two disallow statements:

Disallow: /video/
Disallow: /video

The first example with a trailing slash tells robots not to index anything in the video directory. So far so good. The second example without a trailing slash tells robots not to index anything in the video directory, or any file at the root level with video at the beginning of the filename.

Without the trailing slash, you would exclude /video.html, /videoplayer.aspx – you name it: anything at the same level of the directory structure that begins with video. You can get into trouble in a hurry if you leave the trailing slash off of the disallow directive.

* What prompted the work was all of the bots that kept showing up in our error files. One of the worst? The Internet Archive Bot that collects pages for the Internet Archive. It would generate hundreds of errors a day. When I looked around at bot ban lists, the IA bot showed up over and over. You’d think Internet Archive would have worked the bugs out of their bot by now.


I’m all the time needing to paste text sans formatting. I’ve been using tricks like pasting into the browser’s search box and then copying and pasting that, or pasting into Notepad to clear the formatting.

Turns out you can use Control-Shift-V to paste text without formatting. I probably should have known that a long time ago, but better late than never.

Discovered (where else?) in a Cracked article. Their motto should be “Cracked – It’s like Wikipedia, but with even more penis jokes.”