The robots.txt file is one of the most important components of any website, playing a key role in how it interacts with search engines. This small text file tells search engine robots and crawlers which parts of the site to crawl and which parts to ignore. Proper use of robots.txt can help improve SEO, manage server resources efficiently, focus crawl budget on the pages that matter, and keep unnecessary or sensitive pages from being crawled.
In this step-by-step tutorial, we will examine how to create and edit the robots.txt file in two popular hosting control panels: cPanel and DirectAdmin. By following these steps, you will gain fuller control over how your website's content is crawled and indexed, and provide a better experience for both users and search engines.
What is a robots.txt file and why is it important?
The robots.txt file is a simple but very important text file that resides in the root directory of a website’s hosting and acts as a direct guide for search engine crawlers and robots. This file tells search engines which parts of the website to check and index, and which parts to ignore.
Using this file, you can:
- Control crawler access to specific parts of the site
For example, you can block the path to administrative folders, system files, or sections with duplicate content so that they do not appear in search results.
- Prevent crawling of sensitive or experimental pages
Pages such as the admin login page, a design test area, or content previews usually should not appear in Google. robots.txt restricts crawling of these pages, although for guaranteed exclusion from search results you should combine it with a noindex tag or password protection, since a blocked URL can still be indexed if other sites link to it.
- Optimize crawl speed and quality
By preventing the crawling of unnecessary sections, server resources are managed more efficiently, and search engine bots spend more of their crawl budget on your important content.
For this reason, properly configuring this file is one of the foundations of Technical SEO and can have a significant impact on the site’s performance in search results.
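To make the directives concrete, here is a minimal, commented robots.txt (example.com and the /private/ path are placeholders, not recommendations for your site):

```
# Applies to all crawlers
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/
# Everything else may be crawled
Allow: /
# Tell crawlers where the sitemap lives
Sitemap: https://example.com/sitemap.xml
```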

Creating a robots.txt File in cPanel
1. Logging into cPanel
First, log in to your hosting account’s cPanel. The login address is usually in the following format:
yourdomain.com/cpanel
2. Opening File Manager
In the cPanel dashboard, click on the File Manager option to access your website’s files.
3. Navigating to public_html
On the left side of the panel, find the public_html folder. This is where your website’s main files are located.
4. Creating the robots.txt File
- Click on the + File button.
- Enter the file name exactly as robots.txt.
- Click on Create New File.
5. Editing the robots.txt File
- Right-click on the created file and select Edit.
- Enter commands like the following:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
- Save the changes.
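Before relying on these rules, you can sanity-check them locally with Python's standard-library urllib.robotparser (a quick sketch; yourdomain.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The same directives entered in the cPanel editor above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourdomain.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The admin area is blocked, while normal pages remain crawlable.
print(rp.can_fetch("*", "https://yourdomain.com/wp-admin/options.php"))  # False
print(rp.can_fetch("*", "https://yourdomain.com/blog/hello-world/"))     # True
```

One caveat: Python's parser applies rules in file order, while Google resolves Allow/Disallow conflicts by the most specific (longest) match, so the Allow override for admin-ajax.php is best verified with Google's own tools.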
Creating a robots.txt File in DirectAdmin
1. Logging into DirectAdmin
Log in to your DirectAdmin control panel. The address is usually as follows:
yourdomain.com:2222
2. Accessing File Manager
On the main page, click on the File Manager option.
3. Selecting the public_html Directory
Open the public_html folder.
4. Creating the robots.txt File
- Select the Create New File option.
- Name the file robots.txt.
5. Editing and Adding Directives
- Click on the file and select the Edit option.
- Enter the necessary directives similar to the example below:
User-agent: *
Disallow: /private/
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
- Save the changes.
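If you prefer to prepare the file on your own machine and then upload it through File Manager, a few lines of Python can generate it with the exact filename; the directives below mirror the DirectAdmin example (yourdomain.com is a placeholder):

```python
from pathlib import Path

# The filename must be exactly "robots.txt", all lowercase.
directives = "\n".join([
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
    "Sitemap: https://yourdomain.com/sitemap.xml",
]) + "\n"  # end the file with a trailing newline

Path("robots.txt").write_text(directives, encoding="utf-8")
print(Path("robots.txt").read_text(encoding="utf-8"))
```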
Important Tips for Optimizing the robots.txt File
- Placing the File in the Site Root (public_html)
The robots.txt file must be saved in the main directory of the website (usually the public_html path, i.e. the domain root). This is the default location that search engine bots request first when visiting the site (https://yourdomain.com/robots.txt); if the file is placed anywhere else, crawlers will not find it.
- Accuracy in Entering Paths
Be very careful when defining the paths to block or allow. The slightest mistake in a path can block important pages or allow sections that should stay out of search results to be crawled. Double-check every path before saving.
- Adding the Sitemap Address
Adding a line containing the full address of the sitemap in the robots.txt file helps crawlers quickly identify and index the structure of the site’s pages. Example:
Sitemap: https://example.com/sitemap.xml
- Checking File Validity in Google Search Console
After creating or editing the file, it is recommended to use the robots.txt report in Google Search Console (the successor to the standalone robots.txt Tester) to confirm that the paths and directives have been applied correctly and that no important pages have been accidentally blocked.
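The same kind of check can be scripted. The sketch below parses the earlier DirectAdmin example with Python's urllib.robotparser and reads back the declared sitemap (example.com is a placeholder; site_maps() requires Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# Local sanity check of the example directives
# (example.com is a placeholder domain).
rules = """\
User-agent: *
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Important pages should remain crawlable; private ones should not.
print(rp.can_fetch("Googlebot", "https://example.com/products/"))           # True
print(rp.can_fetch("Googlebot", "https://example.com/private/report.pdf"))  # False

# site_maps() lists the Sitemap URLs declared in the file.
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```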

Conclusion
Creating and managing a robots.txt file is not technically complex, but its impact on SEO and search engine access management is significant. Whether you use cPanel or DirectAdmin, by following the steps outlined, you can easily create or edit this file and have precise control over which paths should or should not be indexed.
By configuring this file intelligently, you keep crawlers away from sensitive parts of your site (keeping in mind that robots.txt is publicly readable and is not a security mechanism), make optimal use of server resources, and help important pages get indexed faster and more completely. The end result is better SEO performance and a stronger position in search results.