We’ve just updated our index and wanted to clarify a few numbers that tend to fly around. We do have a VERY large index.
So where are we now?
We now know about 1.8 trillion URLs on the Internet.
Of those URLs, we have crawled 202 billion unique pages of data. Nearly 10 billion of these were crawled since our last update less than a month ago.
That’s 153 million root domains and 646 million subdomains. In fact, it would be possible to cite much larger numbers, because many root domains and subdomains can be identified on the web but don’t actually resolve. We simply don’t count those.
To put this into a bit of perspective, we are now crawling more than 10 terabytes of data a day, which generally works out to around 40 million URLs a day.
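As a rough sanity check on those two daily figures (and assuming decimal terabytes, i.e. 10¹² bytes), they imply an average of about 250 KB of data per crawled URL:

```python
# Back-of-the-envelope check of the crawl figures quoted above:
# ~10 terabytes of data per day across ~40 million URLs per day.
TB = 10**12  # decimal terabyte (an assumption; binary TiB would differ)

daily_bytes = 10 * TB
daily_urls = 40 * 10**6

avg_bytes_per_url = daily_bytes / daily_urls
print(f"Average data per crawled URL: {avg_bytes_per_url / 1000:.0f} KB")
```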
And the Bells and Whistles?
During our last update, we started reporting EDU and GOV link counts at the domain level. What we couldn’t show then, but now can, is the EDU and GOV link counts at the URL level. That’s pretty sweet – check it out yourself in our bulk backlinks checker.