
The 3 most common SEO mistakes that affect your website's ranking

Camilo Ramirez

Updated: Jun 17, 2024



Many factors influence a website's ranking, and sometimes it is we ourselves who make the mistakes that leave our sites invisible to search engines, making it difficult for users to find our business. That's why it is important to know the most common SEO mistakes, so we can identify them, correct them, and take preventive action. In this article, our experts, Blas Giffuni and Camilo Ramírez, walk us through some of the mistakes we should avoid. Keep reading!


SEO Mistake #1: Accidentally blocking URLs or content on our website in the robots.txt file

Sometimes people focus heavily on developing the website, making sure everything works well in terms of functionality and usability, but they rarely check whether search engines can actually access and index it. People often wonder why their website doesn't appear in search results or why it receives no traffic, and the reason may be that, out of technical inexperience, they skipped one of the most important steps: checking the robots.txt file and confirming that the pages of the site can be found and indexed by search engines. One of the most common SEO mistakes we find is a robots.txt file that blocks pages of a website from search engines.


What is the robots.txt file?


robots.txt is a file placed in the root directory of your domain; that is, at the top level of your website. It lets you tell search engines (whether Google, Bing, or any other) which pages of the site they may crawl and index and which ones you do not want them to access.
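For illustration, a minimal robots.txt might look like the sketch below. The directory name is a hypothetical placeholder; the first rule applies to all crawlers, blocks one private folder, and points them to the sitemap:

    User-agent: *
    Disallow: /private-area/

    Sitemap: https://www.mydomain.com/sitemap.xml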


How do you check the robots.txt file?


Go to the root of your domain. For example, for mydomain.com, the robots.txt sits in the same directory as www.mydomain.com, so on the internet the file is found at "http://www.mydomain.com/robots.txt". Open that file and review how it is written: any line that starts with Disallow indicates that a certain URL or path is being blocked from search engine crawlers. Make sure that no content or section of the website that you want crawled and indexed by search engines is blocked there.
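If you prefer to check this programmatically, Python's standard library includes urllib.robotparser, which reads a robots.txt file and reports whether a given URL may be fetched. A minimal sketch, reusing the hypothetical mydomain.com from the example above:

    # Check whether a URL is blocked by robots.txt, using only the standard library
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.mydomain.com/robots.txt")  # hypothetical domain from the example
    rp.read()  # download and parse the file

    # "*" means any crawler; pass a specific user agent (e.g. "Googlebot") to test just one
    print(rp.can_fetch("*", "http://www.mydomain.com/services/"))  # True if the URL is not blocked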

Which pages (URLs) should be excluded or blocked from crawling and indexing?

- Test pages.
- Automatically generated pages (e.g., result pages produced by sites with internal search bars).
- Intentionally duplicated content or pages.
- Pages that are part of a marketing funnel and that only users who have completed certain steps or processes should view.
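As an illustration of how those exclusions might be expressed, the directories below are placeholders invented for the example; adjust them to your own site structure:

    User-agent: *
    # Test or staging pages
    Disallow: /test/
    # Automatically generated internal search result pages
    Disallow: /search/
    # Intentionally duplicated content
    Disallow: /print-versions/
    # Funnel pages that only users who completed earlier steps should see
    Disallow: /thank-you/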


SEO Mistake #2: Not submitting the website to search engines

When a brand launches and does no marketing, it will probably be difficult for people to discover and get to know the business. Similarly, if a website is launched and never submitted or announced to the search engine, it will take longer to be indexed, and if it is not indexed, our website will not appear in search results. Google and the other search engines do not depend on manual submissions, because they have the technological capability to discover new sites and URLs through crawling. Even so, it is good SEO practice to request a manual review and indexing, because that way you have direct communication with the search engine and make sure it knows your website exists.


This is where having a Google Search Console or Bing Webmaster Tools account works quite well, because that is where you can tell the search engine that you have a website and that you want it indexed. It is a fundamental and very easy step to take. If it is not done, the only ways Google or any other search engine will learn that the website exists are:

- If there is a link from another website pointing to your website.
- If your website appears on social media.
- If it was shared on online forums.

But this process is much longer and more time-consuming. Therefore, it is best to tell the search engine directly that we want it to review our website and the content on it, by submitting the sitemap or individual pages.
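For reference, a sitemap is simply an XML file listing the URLs you want search engines to discover. A minimal sketch, with example.com standing in as a placeholder for your own domain:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-06-17</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2024-06-10</lastmod>
      </url>
    </urlset>

You then submit the sitemap's URL through the Sitemaps report in Google Search Console or in Bing Webmaster Tools.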


Do I have to notify the search engine every time I publish new content and/or a new URL?


It's recommended, but it depends a lot on how much content we generate and how often we publish. For example, if we are a portal like El Tiempo (a media outlet) that is probably posting news every couple of minutes, there is a special way to tell Google or the search engine that this content is categorized as news, and in that particular case it is not necessary to notify it for every article, because the search engine already knows the sitemap corresponds to a news site and keeps rechecking it. If our publishing is not periodic, that is, there are no regularly established days and times for publishing content, it is best to create the new page, submit it to Google, and have it reviewed; likewise, go through the same process with any other search engine you want to index your content. Keep in mind that search engines limit the number of pages you can request indexing for each day. For that reason, the best practice is to submit each page to Google as soon as we finish, create, or publish it.
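For reference, the "special way" mentioned above for news publishers is a Google News sitemap, which adds news-specific tags to each URL entry so Google keeps rechecking it. A minimal sketch, with example.com, the publication name, and the headline as placeholder values:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
      <url>
        <loc>https://www.example.com/articles/breaking-story.html</loc>
        <news:news>
          <news:publication>
            <news:name>Example News</news:name>
            <news:language>es</news:language>
          </news:publication>
          <news:publication_date>2024-06-17T08:00:00+00:00</news:publication_date>
          <news:title>Breaking story headline</news:title>
        </news:news>
      </url>
    </urlset>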


SEO Mistake #3: Not talking about what your business does on your website

Another common mistake in web content creation that hurts SEO is getting carried away by trends and not talking about what the business does; that is, not describing what it offers. If we don't give Google or the search engine an idea of what the business does, what it sells, and which keywords we want the brand to rank for, it will be very difficult for the search engine to work that out on its own. This information makes the search engines' job easier: no matter how much they rely on machine learning, identify user search intent, and deploy other technological capabilities, your website still needs to say what you do and what you offer.

Additionally, there is a global trend for brands to become content producers in order to captivate audiences. The problem is that this content sometimes ends up unrelated to the core of the business or to what the brand offers, because it lies beyond the business's general scope. If this is not done as part of a strategy, it can end up driving low-quality traffic to our website.


Quality of traffic over traffic volume


In terms of SEO, we should aim to create content that brings us traffic over the long term; that is, continuously over time. A good piece of content does exactly that. For this reason, the mindset when approaching SEO and planning positioning strategies should not be to chase traffic volume but, more importantly, to attract quality traffic.


Therefore, when generating web content for our brands, we should ask ourselves: "How many sales can we expect from someone who reaches the website through content that has nothing to do with what the business offers?" If the answer is that it won't generate any sales, that is not a good outcome, because we are attracting users who will only come to read the content, will not take any action, and, even worse, are not part of our target market.

In reality, by generating content that has no relation to the core of the business, we are only incurring unnecessary costs, because we are attracting the wrong audience to our website.


With this information, you can evaluate your SEO practices! We invite you to identify if you are falling into any of these mistakes so you can correct them in time and become a rockstar in search engine positioning, thus getting one step closer to the coveted top spot in search engines.

