Search engine optimization

Search engine optimization (SEO) is a set of actions aimed at improving the positioning of a website in the results of Google, Bing, or other Internet search engines. SEO works on technical aspects, such as optimizing a site's structure and metadata, but it is also applied at the content level, making pages more useful and relevant for users.

Natural or organic positioning

Natural or organic positioning is what a website achieves spontaneously, without an advertising campaign. It is based on the indexing carried out by applications called web spiders (or crawlers) on behalf of search engines. During this indexing, the spiders visit web pages and store the relevant keywords in a database.

The webmaster’s interest lies in optimizing the structure and content of a website, as well as in using link building, link baiting, or viral content techniques to increase the site's visibility through mentions. The objective is to appear in the highest possible positions of the organic search results for one or more specific keywords.

Optimization is done in two ways:

  • Internal / on-page SEO: improvements to content, technical improvements to the code, accessibility, A/B testing, etc.
  • External / off-page SEO: seeks to improve the site's reputation through references to it. This is achieved primarily through natural links (referral traffic) and social media.

Search engines usually show organic or natural results in one area, alongside paid results. Positioning in these paid areas requires paying for special services, such as AdWords or Microsoft adCenter, and is known as search engine marketing (SEM).

The AdWords service can be contracted by impressions (the number of times the ad appears for a given keyword) or by clicks (the number of times the ad is not only displayed but actually clicked by the user).

Techniques to improve positioning

The activities involved include changes in programming, design, and content, aligned with the guidelines issued by search engines as well as with good practices. Search engines such as Google and Bing have published guidelines in this regard.

SEO is divided into internal and external positioning:

Internal positioning

These are improvements that the web developer can apply to the site in terms of content, appearance, accessibility, etc.

  1. Responsive web design. Since an algorithm change in April 2015, it has been reported that Google penalizes websites that lack adaptability to mobile devices with a considerable drop in the SERP (Search Engine Results Page). Moz notes that this is not strictly so: a website that is not responsive does not, for the moment, necessarily descend in the ranking.
  2. Making the web easy to crawl for search engine spiders is the first step. Search engine crawlers must be able to access the pages in order to process them and show them in search results. For that reason, the crawl budget (or crawl rate) directly influences positioning: the more frequently a website is crawled and the more pages are crawled, the better its positioning. This includes points detailed below, such as making pages more accessible, eliminating duplicate content, fixing 4xx and 5xx errors, and making pages as lightweight as possible so that the crawler consumes fewer resources.
  3. Create quality content. The saying "content is king" is common. Since 2015, Google has assigned increasing importance to the so-called "user web experience", which it can measure in statistical terms once a website has been indexed. User experience relates above all to adaptability to mobile devices, the content mentioned above, usability, and loading speed, among other factors. Likewise, the internal link structure is key to usability and user experience (UX).
  4. Structuring and designing a web page with positioning in mind means making sure it is functional, easy to access, and able to capture the user's attention.
  5. Create unique titles and relevant descriptions for the content of each page. Each page is a business card for the search engine. Titles and descriptions are the starting points from which search engines identify the relevant terms across the site. Best practice recommends writing titles of between 60 and 70 characters.
  6. Make the web as accessible as possible: limit content in Flash, frames, or JavaScript. This type of content prevents robots from crawling the information on the different pages or sections; to them, it is a flat space through which they cannot navigate.
  7. Link the pages of the site internally in an orderly and clear manner. A sitemap (both one submitted to Google and one present on the site itself) allows search engines to move through the different sections of the site in an orderly way, improving its visibility. RSS feeds can also be used as sitemaps.
  8. Improve the user experience with design improvements and reduce bounce rates.
  9. Host the web on a reliable server.
  10. Optimize the URLs, placing the most important and significant keywords for the search in them (friendly URLs).
  11. Install an SSL certificate and use HTTPS throughout the site, for both internal and external links.
  12. Create a web design that is clean of advertising and deliver the relevant content in the upper half of the page.
  13. Optimize the loading time of the website to reduce bandwidth, increase the conversion rate, and improve the user experience (Web Performance Optimization, WPO).
  14. Use HTML5 and its semantic sections (header, body, etc.), as well as XHTML5.
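Several of the on-page points above can be checked automatically. The following is a minimal sketch in Python using only the standard library; the `audit_page` function and its warning messages are illustrative assumptions, not part of any standard SEO tool. It checks the 60-70 character title guideline from point 5 and verifies that a meta description is present.

```python
from html.parser import HTMLParser

# Hypothetical on-page audit sketch: checks the <title> length against the
# 60-70 character guideline and verifies that a meta description exists.
class TitleMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.meta_description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_page(html: str) -> list:
    """Return a list of warnings for basic on-page SEO issues."""
    parser = TitleMetaParser()
    parser.feed(html)
    warnings = []
    if not parser.title:
        warnings.append("missing <title>")
    elif not 60 <= len(parser.title) <= 70:
        warnings.append(f"title length {len(parser.title)} outside 60-70 chars")
    if not parser.meta_description:
        warnings.append("missing meta description")
    return warnings
```

For example, `audit_page('<html><head><title>Hi</title></head></html>')` flags both a too-short title and a missing meta description.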
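The sitemap mentioned in point 7 can also be generated programmatically. Below is a minimal, hypothetical sketch that emits an XML sitemap following the sitemaps.org protocol; the URLs are placeholders, not real addresses.

```python
import xml.etree.ElementTree as ET

# Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs.
def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u  # one <loc> entry per page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
```

The resulting string can be saved as `sitemap.xml` at the site root and submitted to search engines via their webmaster tools.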

External positioning

These are the techniques used to improve the visibility of the website in online media. As a general rule, the goal is to obtain mentions on the web, in the form of links, of the site to be optimized.

  1. Get other thematically related websites to link to your website. To do this, it is useful to search for the terms that you believe should bring traffic to your website and study which of those results have complementary content. For example, if you want to rank for the term "hairdresser Madrid", it may be worthwhile to try to get backlinks from hairdressers in other cities.
  2. There are currently hundreds of social networks, for example Hi5, Facebook, and Orkut, in which to participate and obtain visits from new "friends". For Google, the social network with the greatest impact on SEO is Google Plus, which has overtaken Twitter and Facebook in importance.
  3. Sign up for important directories like Dmoz and Yahoo!. Directories have lost much of their interest for search engines but are still a good starting point for getting links or for your website's first discovery by search engines. Both require a human review for inclusion, which ensures the quality of the added websites but also slows down and hinders inclusion.
  4. Register and participate in forums, preferably thematic forums related to your website's activity. Frequent participation has to be accompanied by real, valuable contributions in order to be regarded as a qualified user; the key to getting visits and improving positioning is the link to your website in your signature.
  5. Write articles on other websites. Articles are a very powerful method of improving positioning and attracting visitors. You can write, for example, articles for a course, tips of the day, or pieces on the usefulness of your website's product.

Traits of search algorithms

Public features: these are the features openly declared by the administrators or creators of the algorithm. For example, it is known that Google's algorithm penalizes certain actions by web administrators or content editors. A practical example is Google Panda, whose main objective is to remove copied, duplicate, or irrelevant content, which is considered web spam (also called SEO spam). Google Penguin, for its part, targets unnatural or manipulative link profiles, while technical characteristics of web pages, such as loading times, image optimization, and broken links, are weighed elsewhere in the algorithm. There is also an important human factor for SEO, such as a page's bounce rate.

Private features: these are kept secret by the creators or administrators of the algorithm, so that no one can see the algorithm in its entirety and devise a fraudulent web positioning strategy.

Suspected features: these are secret features discerned through practice; they are not official, but certain characteristics of the algorithms become apparent when doing SEO work. For example, it has been found that, despite everything said by their creators, the most important thing for search algorithms is the creation of relevant, fresh, and useful content.

Link building for search engine optimization

Link building is an SEO technique that consists of getting other web pages to link to the page one wants search engines to consider relevant and position better in their rankings. The technique can occur naturally, when other websites link to the page without any prior agreement because of something it did or said, or artificially, when the links are made to appear as if they had been obtained naturally.

This is based on the idea that one of the factors in evaluating a page's ranking is the number of inbound links it has, a concept that goes back to the fact that the number of inbound links was one of the factors evaluated by Google's PageRank in 1999.

The advantages of this technique are:

  1. The possibility of measuring the demand for a keyword: how many people are searching for it.
  2. Positioning effectiveness.
  3. Brand positioning or branding.

Techniques for SEO

  1. Registration in directories: consists of registering the website in different directories, whether general or thematic. This means submitting the links to relevant directories and choosing the category that best suits the page. Since 2013, Google no longer takes directories into account, which has rendered this strategy obsolete.
  2. Article directories: consists of writing articles and publishing them in directories that, in exchange for the content, allow links to a website to be included.
  3. Bookmarking: saving the content you want to position on the various social bookmarking websites.
  4. Link baiting: one of the techniques most valued by search engines, but one of the hardest to achieve, since hundreds of links to an article are only obtained if it genuinely adds value.
  5. Link exchange: a good way to get links and one of the first to be used. There are many types of exchanges and services.
  6. Buying links: more effective than exchanging links but also more expensive. According to Google's official policy, this way of getting links is subject to penalties.
  7. Links from forums: another way to build links is from forums, by adding the link in your forum signature.
  8. Other techniques: submitting links to blogs and social networks, writing reviews, press releases, among others.

History of SEO

Web page administrators and content providers began optimizing websites for search engines in the mid-1990s, as soon as search engines began cataloging the early Internet. In the beginning, sites like Yahoo! offered inclusion to sites that requested their own indexing, which was done manually.

In the beginning, all web page administrators had to do was submit the address of a web page, or URL, to the different engines, which would send a web spider or web crawler to inspect the site, extract the links to other pages it contained, and return the collected information for indexing. The process involves a web spider belonging to the search engine, which downloads a page and stores it on the company's servers, where a second program, known as an indexer, extracts information about the page: the words it contains and where they are located, the relevance of specific words, and all the links the page contains, which are stored to be crawled later by the spider.

Website owners began to recognize the value of having their pages well positioned and visible to search engines, which created an opportunity for users of both white hat and black hat SEO techniques. According to an analysis by the expert Danny Sullivan, the term search engine optimization began to be used in August 1997 by John Audette and his company, Multimedia Marketing Group, as documented on a page of the company's website.

The first versions of the search algorithms relied on information provided by web page administrators, such as meta tag keywords, or on index files in engines such as ALIWEB. Meta tags offer a guide to the content of each page. Using metadata to index a page proved imprecise, since the words the administrator supplied in the meta tags could misrepresent the page's actual content. Inaccurate, incomplete, and inconsistent meta tag data could, and did, cause certain pages to rank highly for irrelevant searches. Web content providers also manipulated a number of attributes in the HTML source code of their pages in attempts to rank well. Other sites, such as AltaVista, accepted payments for top placement or gave more weight to older sites.

Because factors such as keyword density, which depended entirely on the website administrator, carried so much weight, the first search engines suffered abuse and ranking manipulation. To provide better results for their users, search engines had to adapt, ensuring that their results pages showed the most relevant results rather than unrelated pages stuffed with keywords by unscrupulous administrators. Since the success and popularity of a search engine depend on its ability to produce the most relevant results for any search, allowing those results to be false would push users toward other search engines. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were harder for web administrators to manipulate.

Larry Page and Sergey Brin, graduate students at Stanford University, developed Backrub, a search engine based on a mathematical algorithm that scored the relevance of web pages. PageRank is the name of the number computed by the algorithm, a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a web page will be reached by a user who browses the web randomly, following links from one page to another. In effect, this means that some links are stronger than others, and a page with a higher PageRank is more likely to be visited by a random user.
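The random-surfer idea can be illustrated with a short power-iteration sketch. This is the textbook formulation of PageRank under simplifying assumptions (a tiny hand-made link graph and the classic 0.85 damping factor), not Google's actual implementation.

```python
# Toy PageRank via power iteration: each page's score approximates the chance
# that a random surfer (who follows links with probability d, or jumps to a
# random page otherwise) is currently on that page.
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A page with more (and stronger) inbound links ends up with a higher score:
# here "c" is linked by both "a" and "b", so it outranks "b".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

The scores sum to 1 and can be read as a probability distribution over pages, which is exactly the "random user" interpretation described above.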

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design, which stemmed from the fact that the founders did not know HTML and simply placed a search box and the company logo.
