In web development and server management, the ability to manipulate URLs is crucial. One of the most powerful tools for this purpose is the .htaccess file, a configuration file used by Apache web servers. It allows administrators to execute various directives, including redirects and URL rewrites, without requiring access to the main server configuration files. The .htaccess file can control a wide range of web server functions, making it an essential component in the toolkit of web developers and system administrators.
In this article, we will explore all the intricacies of URL redirection and rewriting using the .htaccess file, examine the principles underlying these operations, study the syntax used in .htaccess, and discuss common use cases for these directives. In addition, we will discuss potential pitfalls and provide tips for troubleshooting possible problems to ensure that redirects and rewrites work correctly.
So, let's get started.
The .htaccess file (short for Hypertext Access) is a configuration file with powerful features used on web servers running Apache HTTP Server. It allows decentralized management of server configuration, giving webmasters the ability to control the settings of individual directories and their subdirectories. The .htaccess file is placed in the directory it controls and can contain various directives, including directives for redirecting and rewriting URLs.
Redirecting and rewriting are fundamental operations in web development. They allow you to change how users access your content and ensure that URLs are structured in a user-friendly and SEO-optimized way. Common scenarios include migrating pages or entire domains, forcing HTTPS, avoiding duplicate content caused by inconsistent URLs, and exposing clean, extension-free paths while the application keeps its internal structure.
The .htaccess file acts at the level of the directory and all its subdirectories. This means that the rules defined in a single .htaccess file only affect the part of the site structure where the file is located. Unlike the global virtual host configuration, .htaccess allows you to override settings locally, making it a convenient tool for developers and administrators who do not have access to the main Apache configuration files.
To understand how Apache uses .htaccess, it is important to understand the order in which requests are processed. This will help avoid conflicts between rules and increase the predictability of the site's behavior.
1. The client sends a request to a specific path, for example:
/blog/articles/how-to-redirect
2. Apache searches for the corresponding file or directory in the file system.
3. Starting from the root directory of the virtual host, Apache sequentially checks:
/
/blog/
/blog/articles/
4. If a .htaccess file is found in any directory, Apache loads it and applies the rules before processing the next level of the path.
5. Only after applying all found .htaccess files is the main request processing logic executed (redirects, rewrites, serving static files, PHP execution, etc.).
Thus, the deeper the nested directory structure, the more times Apache has to load and interpret .htaccess files. This directly affects performance.
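To make this concrete, here is a minimal sketch (paths and rules are hypothetical) of two .htaccess files that would both be applied to a request for /blog/post-1: the root file forces HTTPS, and the one in /blog/ additionally rewrites the article URL. Note that RewriteEngine On has to be repeated in the subdirectory file, because rewrite rules are not inherited by default.
# /var/www/html/.htaccess (document root)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
# /var/www/html/blog/.htaccess (applies only inside /blog/)
RewriteEngine On
RewriteRule ^post-([0-9]+)$ index.php?post=$1 [L,QSA]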
The main disadvantage of .htaccess is that Apache has to read this file for every request, even if the request is for a static file or a simple route. Virtual hosts, on the other hand, are loaded once when the server starts, so they execute faster.
This makes .htaccess a very flexible tool, but less efficient. For high-traffic projects, it is recommended to move rules (especially complex redirects and rewrites) to the virtual host configuration, completely disabling .htaccess.
However, later in this article, we will still look at several options for speeding up .htaccess.
The operation of .htaccess directly depends on the AllowOverride parameter in the Apache configuration. It determines which directives are allowed to be used in .htaccess files.
This directive can take the values None, All, or a list of directive groups such as AuthConfig, FileInfo, Indexes, Limit, and Options.
If AllowOverride is set to None, any rules written in .htaccess will simply be ignored. This is a common cause of “broken redirects” after moving a site to a new server, so always check this setting first.
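For example (the path is illustrative), .htaccess processing for a site root can be enabled in the main configuration like this; allowing only FileInfo is usually enough for redirect and rewrite directives, while All permits every overridable directive:
<Directory /var/www/html>
    # FileInfo is sufficient for mod_rewrite and Redirect directives;
    # use All to permit every overridable directive
    AllowOverride FileInfo
</Directory>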
Redirection (a redirect) is the process of sending visitors from one URL to another. The .htaccess file supports several types of redirects, each serving a different purpose. The two most common are the permanent 301 redirect and the temporary 302 redirect.
To implement a 301 redirect in an .htaccess file, you can use the following syntax:
Redirect 301 /old-page.html http://www.example.com/new-page.html
This directive tells the server to permanently redirect any requests coming to /old-page.html to http://www.example.com/new-page.html. For example, if you are moving your website to a new domain, you can use the following code to redirect all traffic from the old domain to the new one:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]
In this example, the RewriteEngine On directive activates the mod_rewrite module, which allows you to perform complex manipulations with URLs. The RewriteCond directive checks whether the request originates from olddomain.com, and the RewriteRule directive redirects all traffic to newdomain.com, while preserving the rest of the URL path.
A 302 redirect is useful when you want to temporarily redirect users to another page. The syntax is similar to that of a 301 redirect:
Redirect 302 /old-page.html http://www.example.com/temporary-page.html
This command temporarily redirects requests from /old-page.html to http://www.example.com/temporary-page.html.
The example below redirects all traffic from URLs starting with /promo to a special landing page, which can be useful during a marketing campaign.
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/promo
RewriteRule ^(.*)$ http://www.example.com/promo-landing-page [R=302,L]
In addition to the most common 301 and 302 codes, there are two additional types of redirects that are used in more specific scenarios, especially where it is important to preserve the request method (GET, POST, and others). Let's take a look at them.
Code 307 is a temporary redirect, similar to 302, but with one key difference: the client is required to preserve the original HTTP method and the request body when following it.
This is especially important when working with forms, API endpoints, and any logic points where POST cannot be automatically converted to GET.
Example of a 307 redirect:
RewriteEngine On
RewriteRule ^api/v1/(.*)$ /api/v2/$1 [R=307,L]
This rule temporarily redirects API requests to the new version, preserving the request structure completely.
You may be wondering why we are talking about redirects when the directive starts with RewriteRule. This is because mod_rewrite is a universal Apache mechanism: it performs internal URL rewrites by default, and it turns into a full external HTTP redirect when the rule carries the R=301/302/307/308 flag.
When to use a 307 redirect: temporary relocation of API endpoints, form handlers that accept POST data, and any other situation where a POST request must not be silently converted to GET.
Code 308 is analogous to 301, but also guarantees that the request method is preserved. If the site uses POST requests and you want to perform a permanent redirect without breaking the logic, this is the correct option.
Example of a 308 redirect:
RewriteEngine On
RewriteRule ^upload/(.*)$ /storage/$1 [R=308,L]
When to use 308: permanent migration of APIs, upload endpoints, or other services where preserving the request method and body is critical.
Below is a table that will help you quickly understand which redirect is suitable for a specific task. It is especially useful in scenarios where SEO impact or correct processing of POST requests is important. In our opinion, the table format is more intuitive.
Table - Comparison of redirect types
| Redirect Code | Type | Preserves Request Method (GET/POST) | Passes SEO Value | When to Use |
|---|---|---|---|---|
| 301 | Permanent | No (often converts POST to GET) | Yes, almost fully | Site migration, URL structure changes, removing pages |
| 302 | Temporary | No | No | Temporary promotions, maintenance, testing new pages |
| 307 | Temporary | Yes, strictly preserves the request method | No | Temporary relocation of API endpoints sensitive to request methods |
| 308 | Permanent | Yes | Yes | Permanent migration of APIs or services where preserving the request method is critical |
After setting up redirects, it is important to make sure that the server is actually returning the correct response code and that the redirect chain is not causing any errors. There are several ways to check this, either through the console or through browser tools.
The fastest way is to use the curl command with the -I parameter, which requests only the response headers:
curl -I http://www.example.com/old-page.html
In the output, you will see lines like these:
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page.html
If the code is correct (301, 302, 307, or 308) and the Location field contains the desired URL, then the redirect is working correctly.
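If the redirect is part of a chain, it can be convenient to ask curl to follow every hop and print the headers of each response. A quick sketch with a placeholder URL:
curl -sIL http://www.example.com/old-page.html
Each hop appears as a separate block of headers, so a chain such as 301 followed by another 301 and a final 200 is immediately visible.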
In any modern browser, you can test the redirect visually: open the developer tools (usually F12), switch to the Network tab, enable the option to preserve the log, and load the URL being tested. The list of requests will display response codes and transitions, allowing you to see even complex redirect chains or errors such as circular redirects.
For websites that are already in operation, it is useful to check redirects with link analysis services and webmaster panels such as Google Search Console or a site crawler. These tools help you spot long redirect chains, loops, pages returning the wrong status code, and old URLs that still receive traffic. Such checks are especially important after migrations, URL restructuring, or changes in domain structure.
While a redirect changes the URL in the browser's address bar, a rewrite changes the URL internally on the web server before the request is processed. This allows you to maintain clean and user-friendly URLs while keeping the underlying structure of the site intact.
Enabling URL rewriting in .htaccess is done using the RewriteEngine directive. You must include this directive at the beginning of your .htaccess file to activate the mod_rewrite module:
RewriteEngine On
The basic component of URL rewriting is the RewriteRule directive with the following syntax:
RewriteRule Pattern Substitution [Flags]
where:
Pattern: a regular expression that is matched against the requested URL.
Substitution: the new URL or internal path to serve when the pattern matches.
Flags: optional parameters that change the behavior of the rule (e.g., [R=301,L] for a permanent redirect that is also treated as the last rule).
Suppose you want users to see a readable URL such as /product/123 instead of /product?id=123. You can map the clean URL back to its query-string form internally with the following .htaccess directive:
RewriteEngine On
RewriteRule ^product/([0-9]+)$ /product?id=$1 [L]
In this example, the pattern ^product/([0-9]+)$ matches URLs such as /product/123, where 123 can be any number. The substitution /product?id=$1 rewrites the URL internally, allowing your application to process the request using the original query string.
If you are unfamiliar with the concept of regular expressions, we strongly recommend that you read up on it, as it will be difficult to work with rewrites without understanding regular expressions.
For more complex scenarios, you can use conditions and flags to control the flow of URL rewrites.
The RewriteCond directive allows you to apply conditions to rewrite rules. This is useful when a rule should fire only in certain situations, for example redirecting visitors to the mobile version of a site based on the User-Agent header (the pattern below is a simplified illustration, not a complete mobile-detection list):
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (android|iphone|mobile) [NC]
RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]
In this example, visitors whose User-Agent contains one of the listed mobile keywords are temporarily redirected to the mobile version of the site.
Flags change the behavior of RewriteRule. Some of the most frequently used flags are [L] (last: stop processing further rules), [R=code] (perform an external redirect with the given status code), [NC] (case-insensitive matching), [QSA] (append the original query string to the new one), [NE] (do not escape special characters in the result), and [F] (return 403 Forbidden).
Now let's look at an example of using several flags at once:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
In this example, the [NC] flag makes the condition case-insensitive, and the [R=301,L] flags ensure that a 301 redirect will be performed and no further rules will be processed.
In addition to basic URL rewriting examples, administrators and developers often use typical scenarios in their daily work to simplify the site structure, improve SEO, or optimize routing within the application. Below are the most common practical patterns with examples of rules for .htaccess.
This approach makes URLs cleaner and more user-friendly. Instead of a page like /example.php, you can serve a page at /example.
For example:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.php -f
RewriteRule ^(.+)$ $1.php [L]
This rule checks whether the corresponding .php file exists and preserves the URL structure without extensions, making it cleaner and more readable.
If the API has an inconvenient or outdated structure, you can create cleaner routes. For example, instead of /api.php?version=v1&method=getUsers, you can use /api/v1/users.
For example:
RewriteEngine On
RewriteRule ^api/v1/users$ /api.php?version=v1&method=getUsers [L,QSA]
This helps to unify API routes and simplifies their documentation.
Single Page Applications (React, Vue, Angular) use client-side routing. To ensure that all paths within the SPA are handled correctly, .htaccess must redirect any requests to index.html, except for static files.
For example:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^.*$ /index.html [L]
This ensures that requests such as /dashboard, /profile/settings, or /articles/123 will be correctly returned by the SPA instead of causing a 404 error.
When creating a multilingual site, it is often necessary to route requests to the appropriate files or controllers depending on the language prefix in the URL, for example /ru/products or /en/products.
Example:
RewriteEngine On
RewriteRule ^(ru|en)/products/?$ /products.php?lang=$1 [L,QSA]
This rule allows you to maintain readable URLs while directing requests to the core of the site or a specific processing file.
With the growing importance of HTTPS, many websites have switched from HTTP to HTTPS. Google treats HTTPS as a ranking signal, and modern browsers mark plain-HTTP pages as not secure. You can therefore use the .htaccess file to ensure that all HTTP traffic is redirected to HTTPS:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
This rule checks whether the connection is unsecured (RewriteCond %{HTTPS} off) and redirects it to the HTTPS version of the same URL.
Consistency in URLs, such as always using or omitting trailing slashes, is important for preventing duplicate content issues. You can also control this with .htaccess. To force a trailing slash on every URL that does not point to a file:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.example.com/$1/ [R=301,L]
To remove a trailing slash instead:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.*)/$
RewriteRule ^(.*)/$ http://www.example.com/$1 [R=301,L]
You may need to redirect URLs based on query string parameters. The following example redirects users from index.php?page=contact to contact.php:
RewriteEngine On
RewriteCond %{QUERY_STRING} page=contact
RewriteRule ^index\.php$ /contact.php? [R=301,L]
Although redirect and rewrite are often used interchangeably, they serve different purposes: a redirect returns a 3xx response and makes the browser request a new address, while a rewrite maps the requested URL to a different internal path without the client ever seeing it.
Nevertheless, the two can be combined within a single web server configuration. Consider, for example, a scenario where you want to redirect all traffic from www.oldsite.com to www.newsite.com, but within the new site you want to rewrite the URL for cleaner paths. You can combine redirect and rewrite rules in your .htaccess file:
RewriteEngine On
# Redirect old domain to new domain
RewriteCond %{HTTP_HOST} ^www\.oldsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]
# Rewrite URL structure on the new site
RewriteCond %{HTTP_HOST} ^www\.newsite\.com$ [NC]
RewriteRule ^product/([0-9]+)$ /products/details.php?id=$1 [L]
This setting ensures that users visiting www.oldsite.com/product/123 will be redirected to www.newsite.com/product/123, and then this URL will be internally rewritten to products/details.php?id=123 for processing by the server itself.
Below is a table that more clearly shows the difference between redirects and rewrites.
Table - Comparison of Redirect and Rewrite
| Characteristic | Redirect | Rewrite |
|---|---|---|
| Purpose | Move the user to a different URL and change the address in the browser | Modify the internal server path without changing the visible URL |
| Visible in browser | Yes — the address bar updates | No — the browser sees the original URL |
| SEO impact | Yes — redirects are evaluated by search engines | Indirect — used to create clean URLs but does not change the indexed address |
| Use cases | Domain migrations, fixing broken links, moving pages, canonicalization | Creating clean URLs, application routing, flexible query parameter handling |
Now let's return to the issue of performance. Although the .htaccess file provides flexibility and allows you to make changes to the web server configuration without accessing the main Apache settings, its use can significantly affect site performance. This is especially critical for high-traffic projects, sites with deep directory structures, and applications that process a large number of requests. To reduce unnecessary overhead and ensure maximum efficiency, it is important to consider several optimization principles.
To reiterate, with each request, Apache is forced to reread the contents of all .htaccess files located on the path to the requested file or directory. Unlike a virtual host configuration, which is loaded once when the server starts, .htaccess is interpreted dynamically, which creates overhead.
If a website contains dozens or hundreds of directories, each with its own .htaccess file, the resulting delay can be quite significant.
The order of rules in .htaccess also directly affects performance. Apache processes RewriteRule and RewriteCond from top to bottom, and each rule is checked until a stop flag (e.g., [L]) is encountered.
For optimization:
- place the most specific and most frequently matched rules first;
- move broad catch-all patterns, such as ^(.*)$, to the end;
- use the [L] flag so that processing stops as soon as a rule has matched.
This reduces the number of comparisons and decreases the likelihood of unwanted matches.
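A small sketch of this principle (the routes are hypothetical): the frequently requested, very specific rule is evaluated first and stops processing with [L], while the broad catch-all runs only for everything else.
RewriteEngine On
# Specific rule first: cheap comparison, stops further processing
RewriteRule ^sitemap\.xml$ sitemap.php [L]
# Broad catch-all pattern last
RewriteRule ^(.*)$ index.php?route=$1 [L,QSA]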
Each RewriteCond directive is an additional check that takes up processor time. If there are too many of them, performance can drop dramatically.
Recommendations: combine related conditions where possible, avoid repeating the same check (for example, the same %{HTTP_HOST} test) in front of several rules, and prefer simple string comparisons over complex regular expressions when they are sufficient.
For high-performance projects and servers under constant load, it is recommended to abandon .htaccess entirely if you have access to the virtual host configuration. This offers two key advantages: the configuration is read once at server start instead of on every request, and all rules live in one predictable place, which simplifies maintenance.
The use of .htaccess is justified mainly in two cases: shared hosting, where you have no access to the main Apache configuration, and situations where per-directory rules need to change immediately without reloading the server.
In other situations, moving the redirect and rewrite logic to VirtualHost provides a noticeable performance gain and better manageability.
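As a rough sketch of such a migration (the domain, paths, and the rule itself are placeholders), the rules that used to live in .htaccess are placed directly in the virtual host, and .htaccess processing is switched off with AllowOverride None:
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/html

    # Rules moved out of .htaccess
    RewriteEngine On
    RewriteRule ^/old-page\.html$ /new-page.html [R=301,L]

    # Stop reading .htaccess entirely
    <Directory /var/www/html>
        AllowOverride None
    </Directory>
</VirtualHost>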
One of the most common problems when using .htaccess redirects is the creation of an infinite loop, where a redirect rule repeatedly redirects a URL to itself. This can be avoided by carefully structuring your conditions and ensuring that your rules are mutually exclusive.
Example of a safe redirect:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
Here, the conditions check both the protocol and the host to avoid redirect loops.
Although the RewriteLog directive was removed in Apache 2.4 and is not available in recent versions, it is still useful for debugging on older servers (2.2 and earlier). Enabling logging will help you see how your rules are being processed:
RewriteLog "/path/to/rewrite.log"
RewriteLogLevel 3
For newer versions, you may need to read the web server error log or apply custom logging settings to debug .htaccess issues.
URLs with special characters such as spaces, &, or % can cause problems with .htaccess rules. These characters must be properly encoded in your rules, or you may encounter unexpected behavior.
Example of a URL redirect with special characters:
RewriteEngine On
RewriteRule ^my\ file.html$ /my-file.html [R=301,L]
In this example, the space in my file.html is escaped with a backslash to ensure that the rule is processed correctly.
The .htaccess file is used not only to manage redirects and URL rewriting, but also to protect the file structure of a website. When configured correctly, it can block access to confidential data, hide service directories, and restrict access to specific IP addresses. This is especially important for PHP sites, CMS with open directory structures, and projects that contain sensitive data.
Below are the most commonly used protection techniques that help prevent leaks and unauthorized access.
If you need to completely prohibit viewing the contents of a directory, you can use the simplest rule:
Deny from all
This approach is used for service directories (cache, storage, private, temp) that should not be publicly accessible.
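On Apache 2.4 and newer, the same effect is achieved with the Require syntax. A minimal .htaccess for such a service directory (the directory name is only an example) can contain a single line:
# .htaccess inside /storage/: block all HTTP access to this directory
Require all denied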
Many projects contain critical configuration files that should not be accessible through a browser. This applies in particular to .env files with environment variables and secrets, the .git directory, composer.json and composer.lock, package.json, and Dockerfiles.
Example of blocking such files:
# Prohibit access to system files and directories
<FilesMatch "^(\.env|\.git|composer\.json|composer\.lock|package\.json|Dockerfile)$">
Order allow,deny
Deny from all
</FilesMatch>
# Blocking the entire .git directory
RedirectMatch 404 ^/\.git
This rule prevents the leakage of environment variables, tokens, secrets, and repository history.
Sometimes you need to allow only specific users to access certain files or directories, such as the control panel or administrative scripts. To do this, you can restrict access by IP:
<RequireAny>
Require ip 192.168.1.10
Require ip 203.0.113.25
</RequireAny>
As a result, only the specified IPs will be able to access the directory or open the file, and the server will return a 403 Forbidden code to all others. To deny access to everyone except the IP list:
Require all denied
Require ip 192.168.1.10
Require ip 203.0.113.25
If the Options Indexes directive is enabled, the server can automatically display a list of files in the directory, which is dangerous. To disable auto-indexing, write:
Options -Indexes
This is a simple but essential security measure for any website.
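Putting the protections from this section together, a hardened fragment of .htaccess for a typical PHP project might look roughly like this (the file names simply mirror the examples above and should be adapted to the project):
# Disable directory listings
Options -Indexes

# Hide sensitive project files (Apache 2.4 syntax)
<FilesMatch "^(\.env|composer\.(json|lock)|package\.json|Dockerfile)$">
    Require all denied
</FilesMatch>

# Hide the .git directory entirely
RedirectMatch 404 ^/\.git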
In modern versions of Apache, the classic RewriteLog directive, which was used for debugging rules, has been completely removed. Therefore, mod_rewrite diagnostics are performed through the web server's system logs and the logging detail level setting. Proper log configuration allows you to see exactly how Apache interprets RewriteRule and RewriteCond, which rules are triggered, and which are skipped.
In most cases, it is sufficient to enable detailed logging through the main Apache error file. To do this, increase the level of detail:
LogLevel warn rewrite:trace3
The value trace3 shows extended information about how the mod_rewrite module processes requests: matches with regular expressions, condition results, redirects, and the final processing route. Trace levels can reach a value of 8, but it is recommended to enable them only temporarily, as they create very large logs.
Example of enabling:
ErrorLog /var/log/apache2/error.log
LogLevel alert rewrite:trace3
For more detailed diagnostics, you can create a separate log containing data about rewritten requests:
CustomLog /var/log/apache2/rewrite.log "%t %{REQUEST_URI}e %{REDIRECT_STATUS}e"
This log helps you see which URLs were actually rewritten, what the final internal target was, and whether a redirect status was set along the way. Fields can be expanded using Apache environment variables, which mod_rewrite sets automatically.
Typical paths where Apache logs are stored on different operating system families:
- /var/log/apache2/error.log (Debian and Ubuntu)
- /var/log/httpd/error_log (RHEL, CentOS, and derivatives)
- /usr/local/apache/logs/error_log (custom or manually compiled builds)
If RewriteRule does not work, the first place to check is error.log. It always displays messages about incorrect syntax, conflicting rules, or lack of permissions to use .htaccess.
The most common reasons:
- AllowOverride is set to None, so the .htaccess file is ignored;
- the mod_rewrite module is not enabled;
- a syntax error in a directive (usually visible as a 500 error);
- an earlier rule with the [L] flag matches first and stops processing;
- the pattern simply does not match the actual request path.
Check the Apache error log - it usually indicates why the rule did not work.
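If you suspect that mod_rewrite itself is not loaded, this can be checked from the shell. The commands below are the Debian/Ubuntu variants; on other distributions the tool names differ:
# List loaded modules and look for rewrite_module
apache2ctl -M | grep rewrite
# Enable the module and restart Apache if it is missing
a2enmod rewrite
systemctl restart apache2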
When processing a request, Apache first checks all RewriteCond conditions that come immediately before RewriteRule.
RewriteRule will only work if all conditions associated with it return true.
In other words, if a rule behaves unexpectedly, the problem is most often hidden in the conditions attached to it.
To completely disable the use of .htaccess, simply set:
AllowOverride None
in the VirtualHost configuration or directory:
<Directory /var/www/html>
AllowOverride None
</Directory>
After that, Apache will stop reading and applying rules from .htaccess, which provides maximum performance.
Apache processes .htaccess files at every level of the path, from the root directory of the site down to the directory containing the requested file, so several .htaccess files can apply to a single request. For example, when requesting /blog/articles/post-1, Apache will sequentially look through the .htaccess files in /, /blog/, and /blog/articles/.
However, the more such files there are, the lower the performance. Duplicate rules can also cause conflicts.
.htaccess itself does not affect ranking, but managing redirects through it is a critical SEO practice: 301 and 308 redirects pass accumulated link value to the new URL, correct status codes tell search engines which address should be indexed, and long chains or loops waste crawl budget.
As a result, incorrect or circular redirects can lead to traffic loss, so it is important to test the rules.
Working with redirects and rewrites in .htaccess is one of the key tasks when administering websites on Apache. Properly configured rules help maintain a clean URL structure, ensure correct migration of pages and domains, improve usability, and preserve SEO metrics.
Understanding how the server processes RewriteRule, Redirect, and RewriteCond conditions allows you to build configurations of any complexity, from simple redirects to multi-level routing schemes. At the same time, it is important to consider the order of rules, avoid loops, and test the behavior of the configuration in practice.
The examples and techniques discussed in this article provide a solid foundation for working with .htaccess and will help you confidently implement and debug the necessary changes. For a more in-depth study of Apache's capabilities, it makes sense to refer to the official documentation, which details all available directives and URL management mechanisms.
If you want to try out the techniques described in practice, then you can rent an inexpensive VPS server and practice configuring web servers on it.