Today we are announcing another index update – our second this month. The team worked hard to turn it around in record time: just 16 days.
Why is this cool?
A year ago, it took us at least six weeks between creating a “stop” point for the crawl and running the program that builds the index – six weeks at minimum to process the data, and that was with just a trillion known URLs. Now, with well over twice that share of the known internet universe, we have cut the time down considerably, largely thanks to our TeraFlop server.
Anything else new?
The lag between starting an index update and getting the data online is also down considerably in this update. If you are lucky, you will find links in this update that we discovered as recently as the 11th of September – just nine days ago.
Can I see data fresher than nine days ago?
If you want to see links fresher than nine days old, you need to be ALREADY TRACKING root domains in our advanced reports to see fresh crawl data. Tracking does not begin until a few days after ordering the advanced reports.
So how big is the index now?
We now have 2.5 trillion URLs in the index.