Last updated on 7th January 2021 by Rob Carter
Website errors to avoid
I’ve been in the web design and development business since 1996, and over the past 24 years I’ve seen a lot of really bad websites.
From large ecommerce websites with missing payment buttons to local business websites with broken contact forms, I’ve encountered just about every type of online nightmare you could possibly imagine.
I must have audited well over 1,000 websites during my career, and I’m still seeing the same mistakes that website owners, web designers and digital marketers have been making since the days when the internet was filled to the brim with flashing text, neon colours and pixelated graphics.
Only last week I was asked to audit a website for a prospective SEO client, and I soon discovered that search engines were being prevented from crawling and indexing it. This meant that nobody could find the site on Google, Bing or any other major search engine. Needless to say, the website owner wasn’t a happy bunny.
In fact, it was that audit that prompted me to write this post.
Rarely a day goes by when I don’t see the same schoolboy errors being made by people who build, manage and promote websites, and I want to help you avoid making the same mistakes they’re making.
I’ve put together a list of 50 common website mistakes to avoid, organised by discipline. I’ve only included mistakes that are applicable to every type of website, so things like broken checkout pages have been omitted.
I can absolutely guarantee that within the next month I’ll have a conversation with a website owner who doesn’t have a single backup of their website. And I can also guarantee that at some point this year I’ll sign into a client’s admin dashboard using the password “CompanyName123”.
These are the kind of mistakes that no company should ever make – and yet they do. Every. Single. Day.
Don’t be one of them. Run through the list of common mistakes below and make sure you’re not making any of them.
You can thank me later.
1. Weak passwords
Since 2011, internet security firm SplashData has produced an annual list of the 25 most common passwords based on millions of passwords leaked in data breaches.
Each year for the last seven years, the password “123456” has been used more often than any other password. And if that’s not bad enough, “123456” made up 4% of all passwords surveyed!
SplashData’s “Worst Passwords List for 2019” has just been released. Their research shows that almost 10% of people have used at least one of the 25 worst passwords on this year’s list, and almost 3% of people have used “123456” as a password. You literally couldn’t make this shit up.
If you’re using any of the passwords on that list – or any weak password for that matter – stop reading this article right now and change it. Use a mix of characters. Avoid common substitutions. Don’t use sequential letters and numbers and, above all, make it long.
Your security is only as strong as your password. So if yours is easy to guess, change it. Right now.
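If you need a replacement, a strong random password is trivial to generate. Here’s a minimal sketch using Python’s standard library – the 20-character length and the character set are my choices, not a universal rule:

```python
import secrets
import string

def generate_password(length=20):
    # Draw each character from letters, digits and punctuation using a
    # cryptographically secure random number generator
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

Better still, use a password manager and let it generate and remember long, unique passwords for you.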
2. No website backups
I’m not exaggerating when I say that at least once a month I talk to a business owner who doesn’t have a single backup of their website. Even after all these years, it still leaves me speechless.
No matter what you do to secure your website, the risk will never be zero. If your website is hacked, your database becomes corrupted or a rogue plugin breaks the functionality of your site, you'll need a way to restore it to the state it was in before disaster struck.
Imagine investing tens of thousands of pounds and hundreds of man hours into the development of your website, only for it to disappear overnight, with no chance of restoring it. How would you recover customer data? How would you process pending orders?
If you have no backups of your website, you have a problem. And don’t think your web host will save your bacon. Hosting companies will do their utmost to convince you that your site is safe with them; in reality, most hosts will deny any responsibility for your backups.
Don’t believe me? Here are a few little nuggets from some of the largest hosting companies around, taken directly from their legal agreements (correct as of July 2020).
You shall be solely responsible for undertaking measures to: (1) prevent any loss or damage to your website or server content; (2) maintain independent archival and backup copies of your website or server content; and (3) ensure the security, confidentiality and integrity of all your website or server content transmitted through or stored on our servers.
Bluehost assumes no responsibility for failed backups, lost data, or data integrity.
It is the sole responsibility of the customer to make and keep copies of their data, and Bluehost will not attempt to keep data (including backups) of accounts that have been expired or deleted.
You agree to take full responsibility for all files and data transferred and to maintain all appropriate backup of files and data stored on HostGator’s servers.
You acknowledge and agree that it is your responsibility to regularly backup all your Content in order to prevent potential data loss.
Although we may perform regular backups of your site and Customer Content (as described in the Order), we do not guarantee there will be no loss or corruption of data.
You agree to maintain a complete and accurate copy of any Customer Content in a location independent of the Services.
Some high-end hosting plans do include a proper backup service, but these are generally out of the price range of many small business owners.
These hosting companies aren’t being deliberately devious and dishonest. Hosting is a business after all, and these companies need to protect themselves and their interests.
The bottom line is it’s your business. It’s your website. It’s your responsibility!
If you don’t have any copies of your website, stop reading this post and make a backup of your site right now. Don’t worry if you’re not sure how; I’ll be creating a full tutorial next month.
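Until that tutorial is ready, here’s a minimal sketch of a file-level backup using Python’s standard library. It archives your site’s files only – you’d still need to dump your database separately – and the paths in the comment are hypothetical examples, not recommendations:

```python
import datetime
import pathlib
import tarfile

def backup_site(site_root, dest_dir):
    # Create a timestamped .tar.gz archive of the entire site directory
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = pathlib.Path(dest_dir) / f"backup-{stamp}.tar.gz"
    with tarfile.open(dest, "w:gz") as tar:
        tar.add(site_root, arcname=pathlib.Path(site_root).name)
    return dest

# Hypothetical paths – point these at your own site and backup location
# backup_site("/var/www/mysite", "/home/rob/backups")
```

Whatever method you use, keep at least one copy somewhere other than your web server – a backup that lives on the same machine as your site disappears along with it.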
3. Outdated applications, themes and plugins
Of all the common website mistakes in this list, using outdated applications, themes and plugins is probably the one I encounter the most. Unfortunately it’s one of the most serious mistakes you can make.
Internet security firm Sucuri’s “Hacked Websites Trend Report 2019” has revealed that over 56% of all malware-infected CMS applications were out of date at the point of infection.
Software vulnerabilities are one of the leading causes of infections, and Sucuri found that 44% of all vulnerable websites had more than one vulnerable plugin present. Their research also revealed that 10% of infected sites had at least four vulnerable components.
WordPress websites running outdated versions of Contact Form 7 are particularly vulnerable to attack. A nasty permissions vulnerability in Contact Form 7 5.0.3 and older was discovered (and patched) in 2018. It allowed a logged-in user in the Contributor role to edit contact forms – something only Administrator- and Editor-role users have permission to do by default.
Despite the vulnerability being patched almost two years ago, there are still thousands of websites at serious risk of being hacked, purely because they’re using an old version of the plugin.
Sucuri also discovered that well over two-thirds of websites using PHP are using outdated versions that are no longer supported and are therefore not receiving any security updates.
As of December 2019, PHP 5.x, PHP 7.0 and PHP 7.1 are no longer supported – and yet almost 67% of PHP websites are still using one of those versions!
Keeping your applications, themes and plugins up to date is vital for security, and it can also make your website faster and more stable. Just make sure you back up your website before making any significant changes!
4. Non-secure web pages
Even if your website doesn’t process or store sensitive information, you should still protect it with HTTPS.
Apart from the critical security and data integrity it provides for both your website and your users’ personal information, HTTPS is a requirement for many new browser features, particularly those required for progressive web apps.
According to W3 Techs, HTTPS adoption has risen to 63.3%. That’s a reassuring statistic – but it means that almost 37% of websites still use the insecure HTTP protocol to transmit data.
I’ve carried out a couple of HTTP to HTTPS migrations quite recently, and I know for a fact that there are still quite a few local business websites in Surrey that use HTTP. It’s definitely less of a problem than it was just two years ago, but it’s still a significant problem.
There’s another really good reason why you should migrate your HTTP website to HTTPS as soon as possible: your search engine visibility could depend on it.
Ever since Google announced in 2014 that HTTPS was now a ranking signal, they’ve been gently pushing webmasters towards making the leap from HTTP to HTTPS.
Changing your website protocol isn’t a five-minute job; it requires careful planning to preserve your search engine rankings when changing HTTP URLs to HTTPS, and that’s probably why Google has been lenient on HTTP websites for a few years now.
But that’s about to change. If you have a non-HTTPS website, you had better get your skates on and migrate to HTTPS before October. If you don’t, you risk losing traffic, and Chrome will flag your site as “Not Secure” in the address bar. And nobody wants that.
If you’re not sure how to migrate your site from HTTP to HTTPS, I’ll be putting together an in-depth guide before the October deadline. Make sure you subscribe to the blog so you don’t miss it!
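In the meantime, if your site runs on Apache with mod_rewrite enabled, the redirect itself often comes down to a few lines in your .htaccess file. This is a sketch only – the exact rules depend on your server setup:

```apache
# Redirect all HTTP requests to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The redirect is only one part of a migration, though – you’ll also need a valid SSL certificate, and you’ll need to update your internal links, sitemaps and canonical tags to the new HTTPS URLs.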
5. Cheap hosting
You get what you pay for with most things in life, and web hosting is no different.
Cheap hosting means shared hosting. With a shared hosting plan, your website “shares” a web server with hundreds – or even thousands – of other websites. If you’re reading that and thinking “That doesn’t sound good”, you’re right; it’s not good.
When you purchase a shared hosting plan, you’re allocated a specific amount of disk space, RAM, bandwidth, and other resources. Unfortunately, these resources aren’t always distributed evenly or fairly.
The reason is that most shared hosting providers work on averages. They know that not all accounts will use all the resources they’ve been allocated, so they pack accounts onto each server like sardines.
This practice is known as “overselling” – selling more resources than a server is equipped to handle – and it enables hosting companies to squeeze as much revenue from each server as possible.
If your site is crammed onto a server with thousands of other sites that are gobbling up most of the resources, it’s going to load so slowly that your visitors are going to think they’ve travelled back in time to the days of dial-up internet access.
Web hosts known for overselling include HostGator, Bluehost, iPage, TMDHosting and SiteGround. I’m not going to link to any of them because, well, I don’t like any of them.
Spending a little bit more on a quality web host will pay you back tenfold. Your site will load faster, work better and generate more conversions as a result.
If you’re on a budget, Cloudways’ managed cloud hosting plans start at just $10 per month and offer outstanding value for money. Customer support is a little lacking at times, but the features and overall performance are truly exceptional for the price.
If you have a little more to spend and world-class customer support is of paramount importance, Kinsta is a fantastic choice. It’s much more expensive, with plans starting at $30 for one website and 20,000 visitors per month, but the features, performance and technical support are superb.
6. Overly large images
After cheap web hosting, huge images are the most common cause of painfully slow web pages. It’s pretty obvious when you think about it, but that doesn’t stop millions of webmasters around the world from plastering their sites with images you can see from space.
I see it all the time. I’ll often get asked why a particular website is loading so slowly, and invariably it’s because the owner hasn’t taken the time to resize and compress their images before uploading them to their site.
The thing is, these days it takes almost no effort to optimise your images for the web – especially if you use WordPress.
There are numerous image optimisation plugins for WordPress that will compress your images automatically, such as EWWW Image Optimizer, WP Smush, Imagify and my personal favourite, ShortPixel Adaptive Images.
All of these image optimisation plugins work in pretty much the same way, but ShortPixel is a little different. Rather than just compress images as they’re uploaded, the plugin will resize, crop and optimise images on the fly, and then serve them in the next-gen WebP format directly from their content delivery network (CDN).
If the web page contains a 640 x 480px image and is viewed on a laptop, the image will be optimised and served from ShortPixel’s CDN. But if the page is viewed on a mobile phone, the image will be dynamically resized to 300 x 225px (for example) before being optimised and served from their CDN.
It’s really quite clever and works very well. In fact, we use it on the Megademic site.
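The resizing arithmetic behind that example is simple enough to sketch: scale the width down to fit the viewport, then scale the height by the same factor to preserve the aspect ratio.

```python
def fit_within(width, height, max_width):
    # Scale dimensions down to max_width, preserving the aspect ratio
    if width <= max_width:
        return width, height
    scale = max_width / width
    return max_width, round(height * scale)

# The 640 x 480px image from the example above, fitted to a
# 300px-wide mobile viewport
print(fit_within(640, 480, 300))  # (300, 225)
```

That’s all a responsive image service is doing dimension-wise – the clever part is doing it on the fly and caching the results on a CDN.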
If you install an image optimisation plugin (why wouldn’t you?), you’ll want to disable WordPress’ built-in image compression feature or you could end up with poor quality, pixelated photos. Luckily for you, I’ve put together a quick tutorial on how to stop WordPress compressing JPG images.
But what if you don’t use WordPress? These days most popular CMS and ecommerce platforms like Magento, OpenCart, Joomla and Drupal will have their own extensions and addons to help you automatically optimise your images.
There’s really no excuse for having a 365 gigapixel image on your site, so don’t do it.
7. Robots.txt errors
As I mentioned at the beginning of this post, it was the discovery of a serious error on a prospective client’s website that prompted me to write it in the first place. Well, that error was in his robots.txt file – and it had destroyed his site’s search engine rankings and visibility.
The robots.txt file tells search engine crawlers which pages and files they’re allowed to request on your site. Not every search engine respects these instructions, but the ones that matter do.
You know that prospective client I was telling you about? Well, his robots.txt file contained the following code:
```
User-agent: *
Disallow: /
```
That’s bad. Very bad. “User-agent: *” means the commands below are applicable to all search engines, while “Disallow: /” tells those search engines they’re not allowed to crawl any part of the site. Upload that robots.txt file to a live site and experience no organic traffic, ever.
Great if you want to save on hosting fees, but not if you actually want traffic, sales, enquiries and conversions.
A robots.txt file like that is typically added to a staging site as an extra layer of protection to stop search engines from crawling and indexing it. That’s exactly what happened here … except the previous web developer forgot to remove it when he pushed the staging site live!
If you’ve recently had a new website built, visit www.yoursite.com/robots.txt and make sure it doesn’t look like the code in the one above. If it does, fire your web developer and get in touch with experts who know what they’re doing.
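If you’d rather check programmatically, Python’s standard library can parse robots.txt rules and tell you whether a given crawler is blocked. A quick sketch using the same rules that were on the client’s live site:

```python
from urllib.robotparser import RobotFileParser

# The rules that were left on the client's live site
rules = ["User-agent: *", "Disallow: /"]
parser = RobotFileParser()
parser.parse(rules)

# Every crawler is blocked from every URL on the site
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```

A `can_fetch` result of False for your homepage is exactly the disaster described above – no crawling, no indexing, no organic traffic.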
8. No XML sitemap
XML sitemaps can be beneficial to your SEO as they help Google quickly find your important pages, even if your internal linking isn’t perfect.
Google’s documentation states that XML sitemaps are beneficial for “really large websites”, for “websites with large archives”, for “new websites with just a few external links to it” and for “websites which use rich media content”.
While I don’t disagree with that, I think every website would benefit from an XML sitemap. After all, every website needs Google to be able to locate its most important pages, posts, products and categories.
And yet I regularly come across websites that don’t have one.
As you add more and more content to your website over time, it becomes harder to keep track of all the internal links on your site. Quite often, some pages will end up with no internal links pointing to them, making them harder for search engines to find.
With a regularly updated XML sitemap on your website, search engines will always be able to find your important pages and get a better understanding of the structure of your site.
WordPress SEO plugins like Yoast SEO, Rank Math and All in One SEO Pack generate an XML sitemap for you automatically, making it even easier to let search engines know where your important information is located.
Most other modern CMS, ecommerce and web applications will have some form of XML sitemap feature as well, either as a core component or via a plugin.
If an XML sitemap feature is available, use it. And if you don’t have a way of generating an XML sitemap automatically, you could always use an online service like XML-Sitemaps to create one manually.
There’s literally no downside to having an XML sitemap, so make sure you add one to your site.
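To give you an idea of how simple the format is, here’s a minimal sketch that builds a sitemap with Python’s standard library. The URLs and dates are made up for illustration:

```python
from xml.etree import ElementTree as ET

# Namespace defined by the sitemaps.org protocol
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    # pages is a list of (url, last-modified date) tuples
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages for illustration
xml = build_sitemap([
    ("https://example.com/", "2021-01-07"),
    ("https://example.com/about/", "2020-12-01"),
])
print(xml)
```

In practice you’d save the output as sitemap.xml in your site’s root directory and submit it via Google Search Console – but a plugin that regenerates it automatically whenever you publish is far less error-prone than rolling your own.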
9. Dynamic URLs
10. Orphan pages
11. Broken links
12. No internal links
13. Images with no alt text
14. Poor mobile experience
15. Non-standard navigation
16. Inconsistent navigation
17. Generic navigation labels
18. Flyout navigation menus
19. Logo doesn’t link to homepage
20. Centred or right-aligned logo
21. Loading screens
22. Intrusive popups
23. Videos with no controls
24. Generic 404 page
25. Form errors
26. No CTAs
27. Too many CTAs
28. Social links at the top of the page
29. No clear value proposition
31. No terms of service
32. No ‘About Us’ page
33. No ‘Contact Us’ page
34. Outdated copyright notice
35. No social share buttons
36. No favicon
37. Inconsistent branding
38. Lack of white space
39. Unreadable text
40. Poor quality images
41. Homepage slider
42. No blog
43. Poor grammar
44. No page headings
45. Long paragraphs
46. Text-only content
47. Content that’s not focused on visitors
48. Outdated or irrelevant content
49. No Google Analytics tracking code
50. No Google Search Console account
Take the next step
Subscribe to our newsletter, where we share actionable advice and useful tips for building your brand and growing your online business.