Optimal keyword repetition

With the numerous ongoing advancements in the world of SEO, many people find themselves circling back to a rather basic question: if I’m targeting a certain keyword, where exactly should I use it in the front and back ends of my page, and how often? To clear up any doubt surrounding this commonly asked question, we’re going to chat about keyword use, repetition and overuse. This may seem like quite a basic topic for some, but it has actually become rather advanced over the last few years and still appears to be a hot topic for discussion. Questions often asked include, “How many times should my keyword be used in my URL, H1 tag or title?” along with, “How many pages should I have that target this keyword?” What you need to remember is that Google really has evolved.

When you search in Google, you’ll find on occasion (when you’re searching for something a little more obscure, perhaps) that the exact sentence you searched for may not even appear on page one. The reason for this? Google now connects topics and keywords thanks to its algorithms and tries its hardest to understand what it is you’re searching for. This obviously has an impact on your use of keywords, so let’s begin by taking a look at keyword repetition considerations.

Considerations for keyword repetition

There are three primary factors that we need to consider:

Search result snippet – this is one of your first considerations. You need to ask yourself: is the search result snippet informative and useful? Does it tell a potential customer something they may not have already known, or a little about what exactly they’re looking for? Is it useful and can they apply that information? Will it help them to accomplish the task they set out to complete? You may also want to ask yourself whether it will catch the eye and lead a user to click or, at the very least, whether it appears relevant and trustworthy.

Keyword analysis algorithms – this is quite classic and where a lot of SEOs can go wrong, especially those that may be stuck in their ways. Back in the day, before Bing, search engines may have looked at the count and the repetition numbers, looking for less frequently used terms and seeing if you had a higher concentration of those in your document than other people did. Keyword analysis would most certainly cover keyword matching: Google would give you a little boost if you mentioned the exact phrase that had been entered into the search box, for example, “Are western saddles more comfortable?” while everyone else had simply put, “western saddles for sale.” If you know that’s what your article is about and you want to target people looking for that particular answer, then go ahead. Topic modelling is another area that will be looked at, so consider some terms and phrases that are frequently used when you think of western saddles. If other websites are using them but you aren’t, then your document may not be considered as relevant. For all the algorithms know, you could simply be talking about your dog that’s called “western saddle”, so they’ll look to see whether you mention other things that could be related to saddles, such as the word “horse” or “pony”, before they realise what it is you’re talking about. If you don’t use related words and phrases then the algorithm may miss you entirely. Google will also look at intent analysis, where it tries its hardest to understand a user’s intent. Google has a large store of knowledge around previous queries from over ten years’ worth of searches, which gives it the ability to figure out the intent behind a particular keyword search, so you may find you don’t rank as well for informational searches as you do for purchase-based searches.

Searcher opinions and engagement – if searchers scan your search result snippet and don’t deem you worthy of a click, this matters. Within a split second, searchers have asked themselves whether they should click back or re-engage, whether they should share and amplify, and whether they should remember your brand. It isn’t just the searcher who asks these questions; a search engine will too. Google will monitor and measure such activity to see whether people click through or pass you by completely. Bing did a big study on this a while ago, asking themselves, “Does that sound like a spammy domain name?” The result: heavily hyphenated domain names, with a lot of AdSense mixed in, came to be seen as spammy, and people now tend to steer clear.

How many times should my keywords be repeated?

With these three considerations in mind, we now need to move on to a few quick rules of thumb that can be applied to 95% of pages out there. First up, yes, the keyword you’re targeting should appear in the title element of the page. Your primary keyword should also be in the headline, though not simply because the H1 tag is important (it doesn’t actually matter whether it sits in an H1, H2 or even an H3); what matters is that the headline text at the very top of the page includes your primary keyword. This lets your user instantly know they’ve clicked on the correct page for their search and gives them a consistent search result.

This goes down really well with visitors. It’s hugely important from a psychological perspective, so that people don’t click back and choose a different result that better satisfies their need. Within your content, you should be aiming to include your primary keyword around two or three times, though that is a rough rule. Generally speaking, your keyword should pop up at least a couple of times in the content on the page, unless you have a visual or interactive page with no written content. You’d then look to put your keyword perhaps once in your meta description. Your meta description is important because of the snippet aspect: it might help boost your click-through rate and can help you appear more relevant to the searcher. There are odd occasions, say that remaining 5% of cases, when a snippet will be much better off without the keyword. This will usually be the case when the keyword phrase is quite long.
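
To make the placements above concrete, here’s a minimal sketch of the head and headline markup for a hypothetical page targeting the phrase “western saddles”. The title, description and copy are invented for illustration, not taken from any real site:

```typescript
// Hypothetical example: where the primary keyword sits in the title element,
// meta description and headline. All copy below is invented for illustration.
const primaryKeyword = "western saddles";

const head = `
  <title>Western Saddles: How to Choose a Comfortable Fit</title>
  <meta name="description"
        content="Wondering which western saddles suit long trail rides? A plain-English guide to fit, padding and price.">
`;

const headline = `<h1>Western Saddles: How to Choose a Comfortable Fit</h1>`;

// Quick sanity check: count how often the keyword appears in each placement.
const occurrences = (markup: string) =>
  markup.toLowerCase().split(primaryKeyword).length - 1;

console.log(`title/meta: ${occurrences(head)}, headline: ${occurrences(headline)}`);
```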

Secondary considerations to keyword optimisation

There are of course always a few secondary considerations to think about, such as image optimisation: the image alt attribute can contain your keyword, and the image file name itself is equally important for SEO purposes, as images still get a good amount of search traffic. Even if you don’t receive a particularly large amount of click-through traffic via the image, you may have people using the image and citing your website for it, which then creates a link. Although we’re talking about a long tail here, we’re talking about a valuable one.

You might also consider adding the keyword once in the URL. Although this isn’t critical, it’s still important. You could think about placing it one or more times in the subheaders too. If you find yourself with multiple blocks of subheaders that happen to describe different attributes of a piece of content, then you could also use your keywords in there if they apply. Just be careful not to go overboard, as search engines use something called stemming. This basically means that they’ll take the word “skeleton”, for instance, and cut it down to “skelet”, so if the words “skeletal” or “skeletons” or even “skeletals” occur, there’ll be a lot of repetition due to minor variations. This will be deemed totally unnecessary and can in some cases annoy searchers as well as search engines, which may look at it as keyword stuffing.
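
As a rough sketch of these secondary placements, here’s what the image markup, URL and subheaders might look like for the same invented “western saddles” page. The file name, URL and headings are all hypothetical:

```typescript
// Hypothetical secondary placements: keyword in the image file name and alt
// attribute, once in the URL, and sparingly in subheaders.
const imageMarkup = `
  <img src="/images/western-saddle-brown-leather.jpg"
       alt="Brown leather western saddle on a quarter horse">
`;

const pageUrl = "https://example.com/guides/western-saddles"; // keyword appears once in the URL

const subheaders = [
  "<h2>Fitting a western saddle to your horse</h2>",
  "<h2>Trees, padding and rigging explained</h2>", // related terms rather than the keyword again
];

console.log(imageMarkup, pageUrl, subheaders.join("\n"));
```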

On-page keyword use

On-page keyword use is only a very small part of the algorithm. It’s so small, in fact, that you could get this absolutely perfect and still only see a small difference in your ranking, so we’d urge you not to spend too much time on this subject. Do think about your searchers’ intent and your target topics, along with the questions they might have. Search engines are very smart about this and have topic analysis and intent analysis models, which means a page that talks about “western saddles” but fails to mention the words “horse” or “pony” would cause the search engines to assume you aren’t so relevant. You could have a thousand extra links pointing towards you, but a search engine algorithm would still deem you less relevant and subsequently rank you lower. The same is true for searchers looking for a topic that you simply don’t answer: you may get clicked on, but you’ll soon have those potential users clicking back.

Keyword optimisation certainly gives you a lot to think about and can often leave people feeling hugely confused but don’t fret, you needn’t feel confused any longer. If you’d like some expert help and advice, contact Wecan Media today. Their SEO experts will not only advise you on keyword optimisation but how to optimise your website as a whole.

Source – Wordstream, The Wordstream Blog, Keyword Optimization: Why Optimizing for the Right Keywords is 'Do or Die'

Source – SEO Nick, On-Page SEO – How to Optimize Any Page for Your Target Keywords


Causes of your Google penalty

In our previous article, “Identifying a Google Penalty”, we gave you the information needed to tell whether you’d been hit by a search engine penalty, as well as the tools to establish just what type of Google penalty your website had been hit with. Now that you’ve established which Google penalty you’re dealing with, it’s time to find out exactly what caused it. Once you get to the root cause, you’ll be able to fix it.

If you’ve suffered a manual Google penalty then you may well receive a message within your Google Webmaster Tools or, better yet, an email direct to your inbox giving you ideas on how you might improve your chances. If your website has a bad link profile then you might see a warning within the “manual actions” page of Webmaster Tools stating that you have unnatural links to your website. If the cause is your content, however, then you’ll see a comment stating that you have content with little or no added value. If you’re suffering from an algorithmic penalty and are yet to hear from Google, then chances are you won’t be getting any help from Google in figuring out what exactly it is.

You’ll be pleased to know that whatever state your website is in, there will always be room for improvement, and this article is going to look at a rather comprehensive list of potential Google penalty offences. We won’t be going through every known factor, as some factors will only ever affect you positively and, whilst it is beneficial to include such positive factors, they won’t get you penalised. Typically, Google penalty offences fall into three different categories: content, link and technical issues. Let’s first take a look at content.

Content related Google penalties

Google’s algorithms can really struggle to tell the difference between good and bad content, despite the fact that Google has been trying its best to refine this for years. Instead of judging the general quality of the text, the algorithms look for tell-tale signs nestled deep within the content. These are signs that result from patterns consistent with spam content, such as excessive advertising or very obvious plagiarism. The algorithms can also pick up on the absence of patterns that good content usually has. More often than not, it will be a case of these signs building up and reaching a certain threshold. If a website has been marked for a manual review, then an employee of Google will look to see whether it’s a legitimate website that’s simply been unlucky or whether it’s actually trying to play the system with its content. There have been cases where these so-called tell-tale signs have been so plentiful that a serious Google penalty has been triggered automatically; good examples of this would be keyword stuffing or duplicate content on a large scale. Here are some of the most significant tell-tale signs:

  • Duplicate content – it’s super easy for an algorithm to check whether your content has been copied or duplicated elsewhere on the Internet by comparing it to all other content on the net. There are a number of plagiarism tools, such as paperrater.com, that you can use to check your own content.
  • Spun content – this is produced by black-hat SEO tools that swap in synonyms to recreate sentences in a number of different ways. The problem with spun articles, aside from the ethical point of view, is that they tend to read badly for people and will often make no sense at all. They may be able to duck under the radar of algorithms, but they carry a big risk. Generally speaking, content that’s 80% unique or more won’t be flagged by search engines.
  • Thin content – if your website has a high bounce rate, or a high rate of visitors returning straight to the search engine, this can flag potentially thin content. It places a bad mark against you, especially if you have a lot of visitors who tend to leave your website as soon as it has loaded. You can check your bounce rate and average time on page via Google Analytics.
  • Over-optimising – when SEO was first practised, it was often done by stuffing content with keywords; search engines have since improved, and keyword stuffing is now easily spotted and penalised. Keywords really are a game of balance: you mustn’t appear to be trying too hard. To protect against over-optimising, you could use synonyms and keywords related to your main ones (a rough density check is sketched just after this list).
  • Hidden content – this is another tactic from days gone by when people would smuggle invisible keywords into a page that were camouflaged by a matching text and background colour.
  • Content farms – the Panda update had the intention of taking down the content farms that appeared to populate the web. This relates to user-generated content that offers no real use and is there simply to grab traffic and advertise to your visitors. If your website has a forum then you could be in danger of triggering this type of Google penalty, so keep it safe and moderate it constantly.
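
As a rough illustration of the over-optimising point above, here’s a minimal sketch of the kind of check you could run on your own copy before publishing. The stemming is deliberately crude and the 3% threshold is an arbitrary example, not a figure published by Google:

```typescript
// Count how often a single-word keyword (and its simple stemmed variants, e.g.
// "skeleton" / "skeletons" / "skeletal") appears relative to the total word count.
function keywordDensity(text: string, keyword: string): number {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  const stem = keyword.toLowerCase().slice(0, Math.max(4, keyword.length - 2)); // crude stem
  const hits = words.filter(word => word.startsWith(stem)).length;
  return words.length === 0 ? 0 : (hits / words.length) * 100;
}

const sampleCopy = "Skeletons, skeletal anatomy and the skeleton itself all repeat here..."; // sample text
const density = keywordDensity(sampleCopy, "skeleton");

if (density > 3) {
  console.warn(`Keyword density is ${density.toFixed(1)}% - this may read as stuffing.`);
}
```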

Link related Google penalties

Ranking websites based on the number of links pointing to them was what put Google above everyone else; it was their differentiator. It’s a rather elegant system, in that just one website of high standing referencing your content would be taken as a vote of trust and relevance by Google. It can quickly break down, however, when people play the system with hyperlinks, linking to websites for money or linking back to their own websites via articles posted externally. If you’re suffering from a Google penalty and you believe it might have something to do with your links, then take a look at this list to see if anything jumps out. Do note, however, that this list isn’t exhaustive:

  • Exact match anchor text – the majority of the links that point back to your website would naturally display your website’s name, your name or the domain name, because that’s how most people link to a website when they’ve done it themselves, and that’s what appears natural to Google. If an unusually high proportion of your links instead use your exact target keyword as the anchor text, that pattern looks manufactured and can trigger a penalty.
  • Reciprocal linking – swapping links in excess is a big indicator of foul play. A taxi company might reasonably swap links with a hotel in the local area, but if it swapped hundreds of links with sites about poker or knitting, those links would offer no benefit to its customers and could trigger a Google penalty.
  • Paid-for or rented links – quality pages are supposed to be fished out from poor ones by the number of websites that link to them, so any sign of manipulation is a serious violation. Paid links tend to be placed in a rather random way, so if you’ve paid for links previously, take a look at how they were done. Get in contact with the provider of the links and ask for a copy of their work.
  • Link networks – in 2012, many of the world’s link networks were targeted and abolished, which also resulted in the websites that had paid for links being penalised. Link networks do still exist; however, they’re far more underground than they once were, as they’re targeted heavily by Google. Being linked to one of these is like a ticking time bomb: you’ll be found out eventually and will ultimately pay the price. It doesn’t matter how clever the owner of the link network is, they ultimately get caught and you’ll regret the day you contacted them.
  • Advertorials – if people are paying for ad links on your website then Google will see this as a clear violation of their guidelines. If you’re going to sell ad space then make sure you do it through a legitimate and reputable advertising network, or ensure the paid links carry a nofollow attribute.
  • High link velocity – if you were brand new to the web yesterday and suddenly today you have twenty thousand links pointing your way, what would that suggest? Sometimes a high velocity of links happens for natural reasons, say when something of yours goes viral, but more often than not it’s yet another sign of foul play.
  • Links to and from spam websites – this is very similar to high link velocity in that the quality of the linking websites is often poor. Just as associating with them can tarnish your reputation, it can hurt your rankings too. Swapping links with spammy websites, or falling victim to malicious owners who hack into your site and plant links in your content, can have terrible consequences.
  • Site-wide links – if you link to another website on every single page of yours, it can often be seen as manipulation, and the same applies if another website does it to you. There are of course legitimate examples, say if you want a particular link on most pages in order to move people through the sales funnel, but more often than not the intent isn’t so sincere.
  • Hidden links – any link that’s hidden from your visitors will be treated as suspicious. This sort of offence occurs when your link text matches the background colour and becomes hidden, or even merely difficult to see. You can also hide a link in your script files; however, search engines have become rather good at spotting this technique.
  • Affiliate link overuse – linking to affiliate products is perfectly fine, but cramming a page full of such links will set alarm bells ringing. This is especially the case on websites with thin content.
  • Broken links – whether they point internally or externally, if a link brings up a 404 error page then it won’t help the user experience. There are a number of tools around that will sweep your website for you and help you to locate and eradicate such links (a minimal sketch of such a sweep follows this list).
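
For readers who’d rather see what such a sweep looks like under the hood, here’s a minimal sketch assuming a recent Node.js runtime with the built-in fetch API; the URLs are placeholders:

```typescript
// Check a list of URLs and report any that return 404 or fail to respond.
const linksToCheck = [
  "https://example.com/about",
  "https://example.com/old-page-that-may-404",
];

async function findBrokenLinks(urls: string[]): Promise<string[]> {
  const broken: string[] = [];
  for (const url of urls) {
    try {
      const response = await fetch(url, { method: "HEAD" }); // HEAD keeps the sweep lightweight
      if (response.status === 404) broken.push(url);
    } catch {
      broken.push(url); // unreachable hosts count as broken too
    }
  }
  return broken;
}

findBrokenLinks(linksToCheck).then(broken =>
  console.log(broken.length ? `Broken links:\n${broken.join("\n")}` : "No 404s found")
);
```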

Technical issue related Google penalties

Search engine bots crawl through your website’s code, analysing your content and links as they go. That code is the foundation of your website and your online presence, so a technical issue can hamper it hugely. Keep your website running smoothly and it stands a much better chance of lasting over a long period of time. Here are a few examples of technical issues that can lead to Google penalties:

  • Missing site maps – although an XML site map isn’t mandatory, it does help Google as it lets them know whenever you post new content and prompts them to crawl your website more often (a minimal generator is sketched just after this list).
  • Down time – yes, your servers might crash every now and then, but if the problem persists for longer than normal and the search engine notices, it can signal neglect.
  • Website speed – your website can become bogged down over time as extra code is added along with plugin after plugin. To maintain a fast loading time, you could keep larger files on a separate cloud-based storage server. Slower speeds will severely affect you.
  • Bad history – your past can indeed come back to haunt you, especially if you’ve just grabbed a second-hand domain. Even though the previous registration has expired, there might be a lot of spam links still pointing to it, creating problems down the line.
  • Reported to Google – absolutely anyone can report your website, and whether the report is genuine or not makes no difference. If your website is clean then you have nothing to worry about, but if you’ve been flying under the radar then you’ll be flagged and issued with a Google penalty.
  • Hacking – the more well known and successful your website becomes, the greater the chance it will be hacked. Using a popular CMS like WordPress, or not keeping plugins up to date, also makes you a bigger target. Get your website security in order and prepare as best you can.
  • Cloaking – last but not least, there’s cloaking. This is a technique whereby the search engine bots are given different content from what appears in the browser window. Cloaking is a very old practice, so if it’s present it’s likely to have been put in place a good while ago.
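
On the site map point above, here’s a minimal sketch of generating one programmatically, assuming you keep a simple list of page URLs and last-modified dates; the domain and dates are placeholders:

```typescript
// Build a bare-bones XML sitemap from a list of pages and print it.
interface Page {
  loc: string;
  lastmod: string;
}

const pages: Page[] = [
  { loc: "https://example.com/", lastmod: "2016-01-10" },
  { loc: "https://example.com/services", lastmod: "2016-01-08" },
];

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  pages
    .map(page => `  <url><loc>${page.loc}</loc><lastmod>${page.lastmod}</lastmod></url>`)
    .join("\n") +
  `\n</urlset>`;

console.log(sitemap); // save as /sitemap.xml and submit it via Webmaster Tools
```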

For more information on deciphering the different types of Google penalties and where you might be going wrong, contact Wecan Media’s SEO experts today.

Source – Random Byte, New Wave Internet Marketing, SEO, SEO Penalties, Google Penalties: Info and Recovery

Source – Search Engine Land, SEO, 8 Tips For Google Penalty Recovery


Social media and SEO

Just a decade ago, search engine optimisation was pretty straightforward. All you had to do was make sure the search engines could crawl your website, use the necessary keywords and get yourself as many links as possible. As the search engines became more and more sophisticated in terms of delivering very accurate and personal results, the basic SEO signals were no longer enough. Social media, however, specialises in the signals that search engines now crave: identity and relationships. There’s a huge amount of power in identity and relationships, and the best way to describe it is to compare it with everyday life. If a complete stranger came up to you and told you that the new hybrid BMW i8 was amazing for twenty different reasons and was waiting for you at the dealership just three streets down from where you were standing, would you want to go and check it out? No, chances are you’d find it hugely suspicious and run a mile. If a friend with whom you share a passion for sports cars said the same thing, however, you’d no doubt be far more likely to walk the three streets and take a peek. That’s the real power of identity and relationships, and it’s exactly what social media can demonstrate.

Search engines are now working on ways to make their results even more personalised than ever before. They want their results to contain more content that’s directly relevant, whether based on location, your past behaviour or people you know and trust. Social media brings a huge amount of identity and relationship data to the search engines and their algorithms. Early in 2012, the head of Google’s core ranking team, Amit Singhal, spoke about identity challenges with the website Search Engine Land. He discussed how a good product could only be built if the builder understood who was who and who was related to whom. In other words, building a good product requires a huge amount of processing, but it isn’t just about the content; it’s about identity, relationships and content. Amit Singhal went on to explain that anything else would simply trivialise a very hard product.

How social media feeds SEO

When you understand just how important identity and relationships are to search engines, and subsequently to SEO, you can begin to change your marketing behaviour in order to put yourself in a better position to benefit as the tide swells. Simply blasting out Facebook posts that link to your website, however, won’t instantly give you more link authority, because reputable social networks strip link authority from outbound links using 302 redirects or nofollow attributes. The question, therefore: where exactly does the value to organic search come from? In the following three ways:

  • Indirect link building – if no one actually gets to see your content then no one can link to it. Social media can be a very powerful way to expose millions of people to your engaging content. The more people who see it, the more likely it is to earn reshares that expose it to even more. Increasing the number of reshares increases the number of people seeing it, and therefore the number of people likely to link to it on another blog or website that does pass link authority back to your website.

  • Personalisation – social relationships also create an opportunity for your content to show up in an individual’s personal feed. For example, if George likes the “hints and tips” content you posted on Facebook, then his own friends will also see that he likes it. Likewise, if George likes your hints and tips on Google+, then the next time his friend Steve searches for hints and tips, he’ll likely see the page of yours that George previously liked. Personalisation like this benefits your ability to rank, along with your visual appeal, and will also boost trust based on the relationship between George and Steve. Multiply this interaction by 1,000 or 100,000 and you can start to understand the widespread impact personalisation can have on a brand in the right social networks.

  • Performance data – social listening data includes sentiment data and topical data. It can also inform keyword research for search marketing, and the benefit swings both ways in that keyword data can inform social media strategies too. These are all windows into your customers’ desires and needs.

Those aren’t the only benefits social media has on your SEO strategy

With these findings, some have taken their love for social media a little further than expected, declaring SEO on the way out and social media standing firmly in its place. Whatever the future may hold, that’s not currently the case, nor is it ever likely to be, especially given the current relationship that search engines and social media have with each other. Of course there is a certain amount of merit to this argument, simply because of the huge way in which social media has impacted how we learn about and share content, but by no means has it taken over the way we look for a website. Social media has undoubtedly become one of the main factors that search engines now take into account when it comes to indexing content, so let’s dive into some of the other ways in which social media will continue to help your basic search engine optimisation strategy and boost your company’s rankings within the search engines.

  • Author authority – ever since Google launched Google+, search engines have become a lot more integrated. Google now allows authors to associate their content with their Google+ profile page in order to ensure their account and bio become linked to content within the search results. By adding the author tag to your website, listing the websites that you’ve written for in the bio of your Google+ page and including your Google+ link in every article you write, your author listing will appear in the search engines. This helps to build credibility when it comes to your content appearing in search results, which will gain you more recognition and trust with your audience. It does this by pulling in the name, photo and number of followers the author has on their Google+ page, in addition to the usual URL, title tag and meta description. The benefits of author ranking are growing well beyond giving a listing a place within the search results, towards far greater visibility in a search engine overall.

  • Speed up the time taken to index your content – as stated above, content shared across social media is now considered just as much as content on a website by the search engines. Content shared widely on social media is treated as an indication of quality information and ranked as such. The more links a piece of content has pointing to it, the quicker the search engines are able to index it, and since social media can quite drastically influence the number of links any one piece of content receives in a far shorter period of time, it can often speed up indexing. If your content has been tweeted about a huge amount, it has been estimated that indexing time can be cut by around 50%, reducing the time it takes Google to find your content from around two hours to two seconds.

  • Better ranking based on friends and followers – just as a search engine will interpret more social shares of content as a sign of authority and credibility, the number of social connections, followers or friends you have is also a factor in how your content is ranked. We’re not saying that a high follower count is all it takes; it’s the number of quality connections and followers that helps a search engine determine whether you’re a reputable source, as opposed to an account with a high number of spam followers. With this in mind, it’s easy to see why it’s so important to connect with influencers and advocates online who will be genuinely interested in your offerings, so that they share your company’s content and gain your business even more traction within the search engines.

  • Boost keywords from shared content and profiles – the keywords used within your social media channels will also factor into your content rankings within the search engines via two sources: the content being posted and the profile of each social media account. Begin by ensuring all of your social media profiles are brimming with information, including your relevant business contact details, while naturally working in the necessary keywords and the name of your company. Many aren’t aware of this, but the name, URL and bio are typically the most important aspects of your social media profile pages as far as search engines are concerned. You must then continue to include the necessary keywords in the content you post across your social media. It’s easy to incorporate keywords naturally by covering subjects similar to the searches people will be making in order to find your type of product or service.

Of course, this list is by no means exhaustive. There are a number of other advantages that social media can have for your SEO strategy, but it’s important to keep in mind that identity and relationships are key. Beneficial as these signals are, however, they aren’t a complete replacement for SEO. Social media and SEO should be seen as partners rather than alternative options. The same fundamentals of SEO still apply, so if the search crawlers can’t find you, or if you offer no textual content, then there isn’t anything that will help to amplify you in order to rank. If you’d like to find out more about search engine optimisation and social media, or if you’d like to discuss further just how social media can benefit your SEO strategy, then contact an SEO expert at Wecan Media today.


Are the days of “above the fold” web design layouts a thing of the past?

The old mantra used to be that content and imagery on web pages had to be “above the fold”. This has recently been deemed by many an antiquated idea from the very earliest days of the web, when people weren’t really aware what the Internet was actually for or how to use a browser. Back then, the notion of “above the fold” web design mattered, but that was way back in 1999. Today, people know exactly what a browser is and they know how to use it too. The term “above the fold” originates from the newspaper industry, where positioning a story or image above the fold undoubtedly increased readership: newspapers were folded and displayed flat, meaning a compelling headline or photograph would increase sales. As a result, “above the fold” was born.

When the web was in its very early days, newcomers weren’t all too sure how a browser worked. Monitors were extremely small and the notion of the World Wide Web was all too much for some to comprehend. The idea of above the fold was therefore applied to web design. As the screens were small, only things within the initially visible area of the home page were deemed to be above the fold. Back in 1999, this meant that something was only visible if it were placed within the 800x600 pixel dimensions of the home page, making it more likely to be read, seen or clicked on. AOL further cemented the popularity of this concept of web design as its standard interface was 800x600, meaning everything was chopped up to ensure it was contained within the displayed area in order to keep visibility high. Because of this, articles were placed across multiple pages: if you wanted to read further, you simply clicked onto the next page.

Above the fold doesn’t matter as much as it used to

We realise this comment is likely to cause a few heads to turn and a few people to suddenly become breathless, but it’s OK. Visitors to your website aren’t going to run screaming from your homepage simply because they have to scroll. The reason for this comes down to the explosion of the giant desktop monitor and the mobile web. Screen size changes constantly from one person to the next: whilst one may be browsing an entire page on their 27-inch screen, another may be viewing it on their 3-inch smartphone, and because of this, scrolling is a requirement for most. The fold, it would seem, has vanished; it has ceased to be.

A number of studies have been carried out on the “above the fold” theory and they have all concluded that users today will indeed scroll. A user-centric design firm here in the UK, CX Partners, has carried out a lot of research using eye tracking and has consistently found the so-called “fold” to no longer be relevant. In fact, their findings revealed that less content above the fold actually encourages further exploration beneath it, and that if the design gives a tease of more exciting information below, scrolling is almost guaranteed. What we’re talking about here is a bridge of sorts: if you have something to bridge the fold, people will scroll to see more. What’s more, people now fully recognise that scroll bars indicate more content for them to enjoy, and that a scroll bar gives an indication of the length of the page.

Evaluation of click data has also been found to support this notion. Milissa Tarquini, the director of user interface design and information architecture at AOL, writes for boxesandarrows.com about her personal experiences. In her article “Blasting the Myth of the Fold” she provides real-world evidence to support the idea that the fold simply doesn’t matter. One of the most interesting things she mentions is the click data for TMZ: the links at the very bottom of TMZ’s long pages are usually the most clicked-upon links on the website, which indicates that users are more than willing to scroll long pages if the content is enticing enough.

Things above the fold must still be important

Now, it’s of course a no-brainer to proclaim that things above the fold must still be interesting and important. The “above the fold” web design argument is really an argument against scrolling itself, as well as against longer content on pages. The whole mantra behind “above the fold” design was to constrain web design to certain screen dimensions, yet research has shown that compelling content, along with obvious visual clues indicating that more content exists below, is still highly important.

Where is the fold?

The question for some, however, is: where exactly is the fold? Well, back in the ’90s, when the vast majority of screens were 15 inches, designers knew they had less than 800x600 pixels to work with and as a result designed for 640x480. Today, high-resolution monitors are standard practice and, as a result, aspect ratios vary a large amount. Desktop monitors can span over 30 inches; you can even connect your computer to LCD and plasma screens of 55 inches and more. Laptops come in all manner of sizes with different resolutions too, and then there are of course smartphones and tablets. Locating the fold, therefore, isn’t an easy task today.

The “fold” was once described as being at the very bottom of the page, yet if you open your browser on a 27-inch monitor then the majority of web pages would likely display fully within that height, meaning no fold would exist at all. Open the same page on a smartphone, however, and the content would either resize to fit or you’d need to scroll. With this in mind, why would we stick to the old 800x600 dimensions? By sticking to the “above the fold” mantra, your content would inevitably become squeezed at the top of the browser, which would do your web page a great disservice. Those viewing via a small screen would find your content edited and unnecessarily shortened, whilst those on large screens would find themselves viewing a lot of unused space, and all because you or your client doesn’t believe that users scroll down the page. When computer monitors are now either huge or extremely tiny (as is the case with smartphones), it’s simply impossible to fix the position of “the fold” on a web page. The notion may work for newspapers and magazines due to their consistency in size, and it may have worked for the Internet in its early days, but today there wouldn’t appear to be a fold as such. Users today are happy to scroll; in fact, the majority would prefer to scroll and continue reading your content rather than clicking through a number of different pages.
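
To make the point concrete, here’s a tiny sketch you could paste into a browser console to see where “the fold” actually falls for one particular visitor on one particular device; the percentage will differ on every screen, which is precisely the point:

```typescript
// Compare the visible viewport height with the full page height for this device.
const viewportHeight = window.innerHeight;                 // the "fold" for this visitor, right now
const pageHeight = document.documentElement.scrollHeight;  // the full height of the page

if (pageHeight <= viewportHeight) {
  console.log("No fold at all - the whole page fits in view.");
} else {
  const visible = Math.round((viewportHeight / pageHeight) * 100);
  console.log(`Roughly ${visible}% of this page is above the fold on this screen.`);
}
```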

With no fold in sight, could infinite scrolling be the answer?

With multiple screen sizes now commonplace, the notion of infinite scrolling has become more popular; the question, however, is whether it’s for you. Whilst it has proven highly engaging for some websites, offering exciting creative possibilities, it has backfired for others, causing severe navigation problems. So what is infinite scrolling? It’s a feature that allows your visitors to scroll through your website’s single page without ever reaching the end or having to leave the page. With no end, users never reach the bottom and instead browse your content all in one go. This means no waiting for different pages to load, saving them time, although others argue this could make them lose interest. A great example of infinite scrolling success would be social media such as Facebook and Twitter. Social networks are living proof that infinite scrolling can indeed work when applied for the right reasons and in the right way, but you need to decide if this is right for you. Take time to figure out the goal of your company. Is it an e-magazine? Is it content orientated, with plenty of information and articles? If so, then infinite scrolling might just be the perfect option for you. If you own an e-commerce website, however, then infinite scrolling could be a huge mistake, due to the lack of clear structure for your potential customers: navigating your products and services would prove difficult, leading to disorientation and a lot of frustration.
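
For those curious about the mechanics, here’s a minimal sketch of one common way infinite scrolling is implemented in the browser, using an IntersectionObserver. The #feed and #load-more element IDs and the loadNextPage() function are hypothetical placeholders you would replace with your own markup and content source:

```typescript
// Watch a sentinel element near the bottom of the feed; when it scrolls into
// view, fetch and append the next batch of content.
async function loadNextPage(): Promise<HTMLElement[]> {
  // Placeholder: in a real site you'd fetch the next batch from your own API.
  const item = document.createElement("article");
  item.textContent = "Next batch of content…";
  return [item];
}

const feed = document.querySelector("#feed")!;
const sentinel = document.querySelector("#load-more")!;

const observer = new IntersectionObserver(async entries => {
  if (entries[0].isIntersecting) {
    const items = await loadNextPage();
    items.forEach(item => feed.appendChild(item));
  }
}, { rootMargin: "200px" }); // start loading just before the user reaches the bottom

observer.observe(sentinel);
```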

You need to define your audience very carefully and ask yourself whether your visitors are looking for the information you’re giving them. Are they searching via their smartphones or tablets? Are they in search of a powerful visual? If so, then infinite scrolling could work, as it would allow you to present your content in a convenient and very attractive manner whilst removing the hassle of tapping links. Infinite scrolling is also a great addition if your content happens to be image heavy; Instagram and Pinterest are particularly good examples. If your visitors aren’t looking for any of that, however, then their experience will be a very different one. If they’re simply looking to buy your products or pay for the services you have to offer, then you need a footer, and that’s something infinite scrolling simply won’t offer. Equally, if your visitors are in search of FAQs or a Contact Us page, then infinite scrolling could put them off completely. Customers such as these are in search of clear and concise structure, and that’s something you need to give them.

Infinite scrolling can be attractive and creative; however, it’s not for everyone. Equally, the “above the fold” mantra would appear to be something of times past. So where does this leave you? In need of a happy medium, it would seem. If this is the very position you find yourself in, or if you’d like more information on reinvigorating an old website, then contact our team of web design experts today. The web design department at Wecan Media has the knowledge and expertise to really give your website the boost it needs, tailored to your specific requirements and those of your visitors.

Source – Milissa Tarquini, Director of User Interface Design and Information Architecture at AOL, “Blasting the Myth of the Fold”, 24 July 2007, boxesandarrows.com

Source – Wikipedia, The Free Encyclopedia, Above The Fold


Web Design - User Interface

Within the digital world user interface, or UI as it’s otherwise known, encompasses absolutely everything designed into a media device that a human being may interact with. This can include the screen, the keyboard, the mouse and even the appearance of the desktop. It also includes illuminated characters, help messages and how an application or website invites interaction and responses. In the early days, there was little interface except for a few buttons and the operating console. Interface was largely based around punched card input and report output. A few years down the line however and users were given the opportunity to interact with computers online. Here the user interface was virtually blank except for a command line, a keyboard and a set of commands that would exchange responses with the computer.

This command-line based interface led to one where menus dominated, offering lists of choices written in plain text, but it wasn’t long until graphical interfaces began to appear. The graphical user interface, or GUI as it’s also known, was first adopted and enhanced by Apple Computer; however, it wasn’t until Microsoft got their hands on it that it was effectively standardised within the Windows operating system. What some people aren’t aware of is that user interface design can also encompass “user experience”, covering the aesthetic appearance of the device, the response time and even the content that’s presented to the user.

Why is user interface so important?

User interface matters during web design because it’s what makes a website functional and able to generate interest and traffic amongst users on the Internet. You might not realise it, but user interface plays a hugely important role when it comes to bringing traffic to your website, so it’s crucial that it’s given the time and attention needed during the design process. In the current climate, there has been huge growth in e-commerce, with billions of pounds worth of sales going through each year. The Internet has become such an integral part of business that thousands of businesses are now entirely dependent on it for their success. If you want success in any online business then your website needs to be user friendly, as this means it will provide the enhanced user experience that your online visitors want. If a website is simply too complex or too difficult to use, then the online traffic that heads there will leave or, in some cases, be pushed away. Using simple and effective user interface design, however, will help immensely when it comes to achieving the specific objectives of your website.

On top of this, good user interface design won’t just increase the usability of the website; it can lead to much smoother completion of the tasks at hand, such as finishing a transaction or signing up to a newsletter, and your users will find everything much more enjoyable. Typography also plays a vital role when it comes to enhancing the usability of your website. It’s so important, in fact, that it’s said to form a little over 90% of all website components, and it requires special attention during the design process. Textual content has a very important role to play in making your website appeal to online visitors and should, as a result, be optimised for readability as well as for convenience and balance with the graphics.

Using tools like online chat, email and even e-brochures that highlight products and services can also enhance your website’s interactivity; however, world-class results need the mind of an expert designer. It doesn’t matter if your website has amazing graphics, complete with all the bells and whistles, because if it lacks proper functionality then that enhanced user experience you wanted will be absent. For great examples of websites that draw in high-volume traffic thanks to enhanced user experiences, take a look at Facebook and LinkedIn.

Characteristics of successful user interfaces

When it comes to what’s deemed to be a good user interface, there’s a lot of information about various designs and techniques, along with solutions to common problems and recommendations. To simplify things, we’ve listed a few basic characteristics of what makes a good user interface. Here are a few things we think your user interface needs to be in order to reign supreme:

  • Clear – clarity is hugely important. The entire purpose of user interface is to enable people to interact with your website easily by communicating meaning and function. If people struggle to figure out how your website works or where to go in order to complete their desired task then they’ll get confused and no doubt frustrated. This will lead to a high bounce rate.
  • Concise – being clear with your user interface is great but you need to be careful that you don’t fall into the trap of suddenly over-clarifying things. It’s all too easy to add definitions and explanations to things but each time you do it, you start to add mass which can become a little too much for your users to wade through. Keep things clear by all means but make sure you keep things concise too. A great way to do this is to cut things down a little so if you can explain a feature in one sentence instead of three then do it. If you can label an item or graphic in one word instead of two then do that too. By doing this you’re saving time for your users and making things easy and less effort.
  • Familiar – a number of designers go out of their way to make their user interface “intuitive” but what does this actually mean? It’s something that can be naturally understood or instinctively understood but how do you make something intuitive? You can do it by making things familiar. Something is familiar when you feel like you’ve encountered it before so you feel you know how it behaves and you know what to expect. Identify what feels familiar to your users and integrate these things into your user interface.
  • Responsive – responsive can mean a few things. First up, it can refer to speed: the interface and the software behind it should work quickly, because waiting for a page or image to load is frustrating, and seeing things load quickly immediately improves the user’s experience. Responsive can also mean that the interface provides some sort of feedback; it should talk back to the user and tell them what’s happening. For instance, did your user press the button successfully? How do they know if they have? Does your button display a ‘pressed’ state to let them know? Is the page stuck or is it loading? Could you perhaps display a spinning wheel to show that something is loading? (A tiny sketch of this kind of feedback follows this list.)
  • Consistent – adapting to any given context is a smart thing but this doesn’t mean your interface shouldn’t adhere to a certain amount of consistency. By making your interface consistent, you allow users to develop usage patterns allowing them to learn what different buttons and labels mean. This will help them to recognise them and know what to do in different contexts. They’ll also learn how certain things work and that means they’ll be able to operate new features in an instant.
  • Attractive – this can often become a little controversial, but you’ll find a lot of designers agree that a good user interface needs to be attractive. By attractive, we mean that it makes using the interface enjoyable. Yes, you can make your user interface simple and easy to use, but if you go that extra step and make it attractive then you’ll make the whole experience truly satisfying. When your user interface is pleasant to use, your users won’t just come back; they’ll look forward to using it. Obviously, what looks good to one person might not look good to someone else, but this just means you should fashion the look of the interface toward your audience. Adding a level of polish to your interface is quite different from loading it with ten tonnes of extra eye-candy.
  • Efficient – the user interface is like the car that takes you places, where the places are the different functions of the website, and a good interface lets you perform those functions faster and with much less effort. To make your interface efficient, you need to figure out exactly what it is your users want to do and then let them do it without any fuss. Identify how your application should work by looking at the functions it should have and the goals it’s trying to achieve; then implement an interface that lets people easily accomplish what they want to do, as opposed to merely providing access to a list of different features.
  • Forgiving – no single person is perfect and people are bound to make mistakes when using your website, but how you handle those mistakes is a hugely important factor in determining whether people still achieve the outcome they wanted. Don’t punish your user for a simple mistake; build a forgiving interface and remedy the issues that arise. One example of a forgiving interface is allowing users to retrieve information they’ve accidentally deleted.
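
To illustrate the feedback idea from the “Responsive” point above, here’s a small sketch of a hypothetical save button that shows a pressed/loading state immediately and then reports the result, so the user always knows what happened. The #save button and /api/save endpoint are placeholders:

```typescript
// Give immediate feedback when a button is pressed, then report success or failure.
const saveButton = document.querySelector<HTMLButtonElement>("#save")!;

saveButton.addEventListener("click", async () => {
  saveButton.disabled = true;
  saveButton.textContent = "Saving…";              // the press clearly registered
  try {
    await fetch("/api/save", { method: "POST" });  // placeholder endpoint
    saveButton.textContent = "Saved";
  } catch {
    saveButton.textContent = "Something went wrong – try again";
  } finally {
    saveButton.disabled = false;
  }
});
```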

It should be noted, however, that working towards some of these characteristics can clash with working towards others. For example, trying to make your user interface clear with too many descriptions and explanations can make it too bulky; cut things out to make it concise and you may create ambiguity instead. Achieving the right balance takes time and skill. If you don’t think you have the necessary knowledge and time to really get involved, then contact our experts at Wecan Media today.

Source – C. A. D’H Gough, R. Green, M. Billinghurst, “Accounting for User Familiarity in User Interfaces”, retrieved 13 June 2014

Source – McCown, Frank, “History of the Graphical User Interface (GUI)”, Harding University

