Using Majestic to evaluate a data source

Today we caught our lead developer using Majestic to assess whether a snippet of library code was worth considering.

Bet you didn’t think to use Majestic for that, did you?
The Situation
We were looking at an “Upload your CSV” routine, where a user can bulk check a list of URLs against our Flow Metrics, link counts and other data. It was all going swimmingly until marketing (aka me) poked their nose in and asked the developer to try the new beta by uploading a CSV output from Google. Oops… many of these reports had multiple columns, and our upload system expected just one. Damn marketing…
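The failure mode is easy to reproduce in any language: a hand-rolled parser that assumes one value per line breaks as soon as an export adds columns or quotes a comma. A minimal sketch of the pitfall, using made-up data (shown in Python for brevity; a CSV library such as OpenCSV plays the same role in Java):

```python
import csv
import io

# A Google-style export: multiple columns, with a quoted comma inside a field.
export = 'url,clicks,label\nhttp://example.com,42,"news, sport"\n'

# Naive approach: split each line on commas (what a one-column upload
# routine might do). The quoted field is wrongly broken into two pieces.
naive_rows = [line.split(",") for line in export.strip().splitlines()]
print(naive_rows[1])   # ['http://example.com', '42', '"news', ' sport"']

# A real CSV parser handles quoting and multiple columns correctly.
parsed_rows = list(csv.reader(io.StringIO(export)))
print(parsed_rows[1])  # ['http://example.com', '42', 'news, sport']

# To feed a single-URL bulk checker, just keep the first column.
urls = [row[0] for row in parsed_rows[1:]]
print(urls)            # ['http://example.com']
```

This is exactly the class of problem a maintained CSV library solves for you, which is why the next question was which library to trust.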
The Question
Not wanting to destroy all credibility with the development team, I offered a code library for consideration: OpenCSV.sourceforge.net.
The Analysis
Before Steve even looked at the code, he wanted to know whether others used and believed in it. Some code libraries have support – a community – and in those cases the code gets refined and improved. Others get overlooked, or are only used by the original developer; these wither and die, suggesting a better alternative is available.
Majestic gave Steve some clues.
This particular code library looked pretty decent. Over 870 websites had linked to the code over the last five years and – importantly – nearly 200 of those links had been reviewed and checked within the last 60 days. So the code has some following. It is either pretty good or really bad! We can also see that the links are unlikely to be artificially generated, because the Trust Flow and Citation Flow of the sub-domain are relatively similar. Artificial situations tend to cause artificial disparities in these numbers. Flow Metrics in their 30s are nothing to be sniffed at.
Drilling down to the backlinks themselves, within two clicks we can see what the world thinks – starting with the most influential of those voices. I click on “backlinks” and then “remove deleted links”. Two clicks and less than a one-second wait.
The very top link tells it all. http://commons.apache.org/csv/ is on a hugely influential site and is itself a very trustworthy page. On that page we find that there is enough belief in the library that a company in Switzerland has funded development to further improve the library.
The Conclusion
Given that OpenCSV also comes with a commercial license, Steve didn’t need to go much further before allowing his developers to check out the code. Apparently… for a guy in Marketing… I didn’t turn out to be the biggest mug of the day in development.
The Corollary
This way of using Majestic data ad hoc to check the validity of a page or URL pushes Majestic’s uses way beyond search. For example:
- A Twitter user may have 10,000 followers… but what’s the Trust Flow on their handle?
- A commentator may have claimed to invent the wheel, but does the mob care?
- You are choosing between two competing products… which one should you choose?
Right. Now that the development team have been outed, let’s hope the new technology they are building won’t be far behind!