How do I fix my robots.txt?

My SEO tools are saying that my robots.txt is not crawlable. How do I check if it's a server problem in nginx?

1 Reply

Linode Staff

I'm not too familiar with SEO tools or with the exact configuration you have set up, but if you're using AIOSEO (All in One SEO), it no longer generates its own robots.txt as a dynamic page. Instead, it uses the robots_txt filter in WordPress to tap into the default robots.txt that WordPress creates. It would be helpful to have the exact error code you're running into, but this forum post goes into detail about what could potentially cause that error and what solutions you could try to resolve it.

Additionally, if you're trying to check how your robots.txt file is served by your NGINX configuration, I was able to find a few resources that could point you in the right direction:

This GitHub post in particular goes over how you can allow access to all User-agents in a robots.txt file.
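To sanity-check the rules themselves, you can use the robots.txt parser in Python's standard library. This is just a sketch (the URL is a placeholder): it confirms that a file with `User-agent: *` and an empty `Disallow` directive permits all crawlers, which is the "allow everything" form the post above describes.

```python
# Sketch: verify a robots.txt ruleset locally with Python's stdlib parser.
# No server is involved; the URL below is a placeholder.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow:
"""  # an empty Disallow blocks nothing, so all crawlers are allowed

parser = RobotFileParser()
parser.parse(rules.splitlines())

# True: any user agent may fetch any path under these rules
print(parser.can_fetch("*", "https://example.com/some-page"))
```

If this prints `False` for a page you expect to be crawlable, the problem is in the rules themselves rather than in how NGINX serves the file.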

Beyond the information I provided, you may also want to reach out to NGINX's forums, where knowledgeable users may be able to provide more in-depth assistance.
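For reference, a minimal NGINX server block that serves robots.txt as a plain static file might look like the sketch below. The root path and server name are assumptions; substitute your own site's values. If `try_files` falls through to 404 here, the file is missing or unreadable by the NGINX worker user, which would explain a "not crawlable" report from SEO tools.

```nginx
# Sketch of a server block that serves robots.txt as a static file.
# root and server_name are placeholders; adjust for your site.
server {
    listen 80;
    server_name example.com;
    root /var/www/html;

    # Serve robots.txt directly from the web root
    location = /robots.txt {
        try_files $uri =404;
        access_log off;
    }
}
```

After editing, you can validate the configuration with `nginx -t` and reload before re-running your SEO tool's check.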

