On the off chance Skynet decides to target Sergey Brin and Larry Page, it’s going to have to override a text file to do so.
On June 29, 1994, Martijn Koster created the robots.txt file, which tells web crawlers such as Google's which pages on a site they should ignore. The file was soon adopted by all of the major web indexes of the time, and it became a staple of the Web experience.
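For anyone who hasn't peeked at one, a robots.txt file is just plain text served from a site's root. A minimal sketch (the crawler name and paths here are purely illustrative) might look like:

```
# Applies to every crawler
User-agent: *
# Ask crawlers to skip these directories
Disallow: /private/
Disallow: /tmp/
```

Well-behaved crawlers fetch this file before indexing and honor the Disallow rules, though compliance is entirely voluntary on the crawler's part.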
Apparently Google wants to take it a step further: a killer-robots.txt file recently appeared on its site. The simple file contains only four lines, but it could end up being the most powerful text file ever should Skynet come to power. (Although why it leaves off the T-X or T-850 is a mystery!)
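For the curious, the four lines in question, as widely reported at the time, read roughly like this:

```
User-Agent: T-1000
User-Agent: T-800
Disallow: /+LarryPage
Disallow: /+SergeyBrin
```

In other words: the same syntax any crawler would obey, just aimed at a rather more dangerous pair of robots.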
At least Larry Page and Sergey Brin will be safe.
Clearly Google was just having a bit of fun when it posted this file, but no one is quite sure exactly when it appeared. The general consensus is that it was probably posted for the 20th anniversary of robots.txt. At least the file covers the Terminator models from the first two films, and then rightfully ignores the rest of the series.
Perhaps every site should do this. I mean, sure, there are no Terminators yet, but a couple of lines of text to save some folks? Seems worth it in my book!