Caching The Inside



There is a limited and known set of internal IP addresses. You've got the 10's, the 172's, and the 192's - the RFC 1918 private ranges. That's it. If I were to create a web page - like any other web page - made up of nothing but http links to the full range of those IP addresses, would Google (or any other search engine) attempt to crawl those pages? I hear you interrupting me: 'but it won't work, those are internal addresses'... Ding! That's right - for you. But would the search engine crawl them? Would it cache them as well? It would effectively crawl its entire - own - internal network. Couple that with caching and, lo and behold, you could now search inside the indexed Google network. Or that's my theory anyway...
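To give a sense of scale, here's a sketch (my own illustration, not a page I've actually submitted to a crawler) that enumerates the three RFC 1918 ranges with Python's `ipaddress` module and emits the kind of minimal links the page would contain. The `link_lines` helper name and the link markup are assumptions for illustration.

```python
import ipaddress

# The three RFC 1918 private ranges the post refers to:
# the 10's, the 172's, and the 192's.
PRIVATE_RANGES = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def link_lines(network, limit=None):
    """Yield one minimal HTML link per host address in the network.

    `limit` caps the output so you can sample a range without
    generating all ~17 million links at once.
    """
    for i, host in enumerate(network.hosts()):
        if limit is not None and i >= limit:
            break
        yield f'<a href="http://{host}/">.</a>'

# Total number of addresses across all three ranges:
total = sum(net.num_addresses for net in PRIVATE_RANGES)
print(f"{total:,} addresses")  # 17,891,328 addresses

# A small sample of what the page would look like:
for line in link_lines(PRIVATE_RANGES[2], limit=3):
    print(line)
```

Iterating the full 10.0.0.0/8 range alone yields nearly 17 million links, which is what makes the page size a problem in the first place.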

There are a few potential problems with this, though. While attempting to create a page with links for all of those IP addresses, I realized that even with the most minimal HTML per link the page would be well over 200MB. Would a search engine even attempt to retrieve, parse, and index a file that large? And if it did, would all the timeouts on those unreachable addresses amount to a denial of service against the spider?
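As a back-of-envelope check on that size estimate (a sketch under an assumed average link length, not a measurement of any real page):

```python
import ipaddress

# Count every address in the three RFC 1918 private ranges.
ranges = ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")
total_links = sum(ipaddress.ip_network(r).num_addresses for r in ranges)

# Assumption: a stripped-down link such as '<a href=http://10.0.0.1>.</a>\n'
# averages roughly 32 bytes once the longer dotted quads are counted.
AVG_LINK_BYTES = 32
size_mb = total_links * AVG_LINK_BYTES / 1024**2
print(f"~{size_mb:.0f} MB for {total_links:,} links")  # ~546 MB for 17,891,328 links
```

Under that assumption the page actually lands closer to half a gigabyte, which only strengthens the point: no sane crawler is likely to fetch and parse the whole thing.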

If anyone can think of a way to take this further, do so - and fill me in on the results!
