People often ask what metrics I use when looking for domains.
Here’s my long-winded answer, in order of the tools I use.
I use Moz metrics because, even though they have by far the smallest index of the big three, they have the best metrics. I look at both PA and DA: first DA, for overall domain strength (DA 20+ is worth developing; I personally focus on 30+, though they are harder to find). Then I look at PA to decide where the homepage of the site should launch (www vs non-www vs subdomain vs subpage) and how I should set up my redirects. A standard launch note would be: “Redirect non-www to www, install a 404 redirect, and verify that /blog/index.html is redirected.” I used to sell links based on PA, but because the link eventually gets recycled into the archive, I’ve found DA is actually the better long-term metric.
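As a sketch of that standard launch note, assuming an Apache host with mod_rewrite enabled (example.com and the paths are placeholders, not a real client setup), the redirects might look like:

```apache
RewriteEngine On

# Redirect non-www to www with a 301 so old URLs pass their value on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Verify the old blog homepage is redirected explicitly
Redirect 301 /blog/index.html http://www.example.com/

# "404 redirect": anything that no longer exists goes to the homepage
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ http://www.example.com/ [R=301,L]
```

The 301s matter here: a temporary 302 would not pass the old pages' value to the new homepage.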
I use Ahrefs for two things: the tag cloud on the overview page, and the Top Pages report. The tag cloud is great because you can identify spam at a glance from the anchors and their percentages (any single anchor over 50% is spam). Top Pages is simply the easiest way to check which pages need redirects or WP installs, whether you need to create subdomains, or whether there was a lot of subdomain spam.
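That 50% rule of thumb is easy to automate once you have exported anchor-text counts (Ahrefs lets you export anchors to CSV; the counts below are made up for illustration):

```python
def flag_spam_anchors(anchor_counts, threshold=0.5):
    """Return anchors whose share of all backlink anchors exceeds threshold."""
    total = sum(anchor_counts.values())
    if total == 0:
        return []
    return [a for a, n in anchor_counts.items() if n / total > threshold]

# Hypothetical export: anchor text -> number of referring links
anchors = {"cheap viagra": 120, "example.com": 40, "click here": 25}
print(flag_spam_anchors(anchors))  # any single anchor over 50% is a red flag
```

A clean, naturally linked domain almost never has one anchor dominating like this, which is why the glance at the tag cloud works.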
ARCHIVE.ORG is a fantastic tool. Parked pages are fine. Often sites are put up as autoblogs, then abandoned and never link-built, and these tread the line. If there’s no evidence of suspicious links in either OSE or Ahrefs, I will often still pick these up. The difference is that I launch them and wait for indexation before developing them. I completely ignore PageRank. I completely ignore Majestic.
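To speed up the Archive.org step, the Wayback Machine's CDX API can list a domain's capture history so you can skim it before opening snapshots in the browser. A minimal sketch, with error handling omitted and the query parameters kept simple:

```python
import json
import urllib.request

def snapshot_timestamps(cdx_json_text):
    """Parse the CDX API's JSON output (a header row followed by data rows)
    into a list of snapshot timestamps (YYYYMMDDhhmmss strings)."""
    rows = json.loads(cdx_json_text)
    if not rows:
        return []
    header, data = rows[0], rows[1:]
    ts_col = header.index("timestamp")
    return [row[ts_col] for row in data]

def fetch_snapshots(domain):
    """Query the Wayback CDX API for a domain's capture history."""
    url = ("http://web.archive.org/cdx/search/cdx"
           f"?url={domain}&output=json&fl=timestamp,statuscode&limit=50")
    with urllib.request.urlopen(url) as resp:
        return snapshot_timestamps(resp.read().decode())
```

Long gaps in the timestamps, or years of captures that all look like the same autoblog template, are exactly the borderline cases described above.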
Bonus tip! When you find a clean domain, analyze its backlinks to find more gems. That means a quick Scrapebox + Xenu crawl, followed by a bulk domain check and a Netpeak check for PA/DA.
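Before pasting the harvested backlink URLs into a bulk checker, it helps to collapse them to unique hostnames first. A minimal sketch (the naive www-stripping here is an assumption and won't handle every suffix, e.g. .co.uk registrable domains):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Collapse a list of backlink URLs to unique hostnames,
    stripping a leading 'www.' so both forms dedupe together."""
    seen = []
    for u in urls:
        host = urlparse(u).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host and host not in seen:
            seen.append(host)
    return seen

urls = [
    "http://www.example.com/page1",
    "http://example.com/page2",
    "https://blog.other-site.net/post",
]
print(unique_domains(urls))  # ['example.com', 'blog.other-site.net']
```

The deduped list is what goes into the bulk availability check and then into Netpeak for PA/DA.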