Cloaking is a technique for delivering different pages and content to human visitors and to search engine robots in response to identical queries.
Cloaking can be implemented with the ".htaccess" file. A search engine robot is usually detected by either its IP address or its user-agent string (the first method can fail when a robot arrives from a new, unlisted IP; the second, because a user-agent can be faked), and the robot is then redirected to a separate page created specifically for it.
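The .htaccess approach described above can be sketched with Apache's mod_rewrite module; a minimal, hypothetical example that matches crawler user-agent strings (the bot names and file paths here are illustrative, not taken from any real deployment) might look like this:

```apache
# Hypothetical sketch: serve a robot-only page to known crawler user-agents.
RewriteEngine On
# Match common crawler user-agents, case-insensitively ([NC]).
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot|YandexBot) [NC]
# Internally rewrite the request to the version prepared for robots.
RewriteRule ^index\.html$ /robot-version.html [L]
```

As the text notes, this check is easy to defeat: anyone can send a crawler's user-agent string, which is why IP-based detection is often preferred.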
Another method is implemented with a Perl script or with ASP/PHP. When a request arrives at the web server, the server first runs a cloaking script instead of immediately returning an index file. The script compares the requester's IP address against a database of IP addresses belonging to known spiders. If the IP turns out to belong to a spider, the specially optimized version of the page is served; if the address is not in the database, the public version is shown.
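The core of such a script is a simple lookup. A minimal sketch in Python (the text mentions Perl or ASP/PHP; the language, the IP addresses, and the file names here are illustrative assumptions):

```python
# Hypothetical spider IP database; real cloaking scripts maintain
# much larger, regularly updated lists of known crawler addresses.
SPIDER_IPS = {"66.249.66.1", "157.55.39.10"}

def choose_page(requester_ip: str) -> str:
    """Return the optimized page for known spiders, the public page otherwise."""
    if requester_ip in SPIDER_IPS:
        return "optimized.html"   # version prepared for search engine robots
    return "public.html"          # version shown to ordinary visitors

print(choose_page("66.249.66.1"))   # IP found in the spider database
print(choose_page("203.0.113.7"))   # unknown IP, public version served
```

The same set-membership check would be wired into the server's request handling so it runs before any page is returned.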
Today, even when the search engines do not detect it, cloaking rarely yields tangible results, because off-page factors significantly influence ranking.