Not only that, we just need one that works the way Google used to. You know, the old algorithm that made Google famous and made its competitors obsolete.
The hard part of what Google does is spam and abuse prevention. If you only indexed known-good sources, I think the hard part would basically be done. Providing good search results, rich queries, sentiment analysis, etc. is all old hat by now.
"Known-good" sources have a serious habit of getting owned by malware and replaced by organized crime schemes. Anti-abuse can't be handled by whitelists, it needs to be an online system.
Garbage sources have a serious habit of getting taken over. It's not that common with quality sites, but also not impossible. A simple downvoting system would be good enough.
The problem is that even quality sites die. And when they do, their domain gets poached and suddenly it's not a quality site anymore. But according to the ranking algorithms it's the same site, since its quality is measured by how many links it gets.
Co-opting quality sites is a bit of a cottage industry. There is also a significant number of hacked WordPress sites. Their domains may be quite reputable, but they're unknowingly hosting a ton of spam.
I'm actually using Google's original algorithm, with a modification suggested in the original PageRank paper, called Personalized PageRank, which lets you use a specific segment of the internet as a tastemaker when creating the ranking.
It's a bit funny because the authors of the paper suggest this modification and explicitly state that it makes the algorithm extremely resilient to manipulation by commercial interests.
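For the curious, here's a minimal sketch of what that personalization amounts to, on a hypothetical toy graph (the seed set and the 0.85 damping factor are illustrative choices, not anything from a real index): instead of teleporting uniformly to every page, the random surfer teleports only to a hand-picked set of trusted pages, so everything ends up ranked by its link-proximity to pages you already like.

```python
import numpy as np

def personalized_pagerank(adj, seeds, damping=0.85, iters=100, tol=1e-9):
    """Power iteration for Personalized PageRank (toy sketch).

    adj[i][j] = 1 if page i links to page j.
    seeds: indices of trusted "tastemaker" pages; all teleport
    probability mass goes to these instead of to every page.
    """
    n = len(adj)
    M = np.array(adj, dtype=float)
    # Row-normalize so row i is page i's out-link distribution,
    # then transpose to get a column-stochastic transition matrix.
    out = M.sum(axis=1, keepdims=True)
    out[out == 0] = 1.0  # dangling pages simply leak rank in this sketch
    M = (M / out).T

    # The whole modification: teleport mass concentrated on the seed
    # set, where vanilla PageRank would use 1/n for every page.
    v = np.zeros(n)
    v[list(seeds)] = 1.0 / len(seeds)

    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r_next = damping * (M @ r) + (1 - damping) * v
        if np.abs(r_next - r).sum() < tol:
            break
        r = r_next
    return r

# Toy web: pages 0-2 are a trusted cluster, pages 3-4 a spam farm
# that links only at itself.
links = [
    [0, 1, 1, 0, 0],  # 0 -> 1, 2  (seed)
    [0, 0, 1, 0, 0],  # 1 -> 2
    [1, 0, 0, 0, 0],  # 2 -> 0
    [0, 0, 0, 0, 1],  # 3 -> 4     (spam)
    [0, 0, 0, 1, 0],  # 4 -> 3     (spam)
]
print(personalized_pagerank(links, seeds=[0]))
```

The spam farm (pages 3 and 4) can link at itself all day, but it gets no teleport mass and no links from the trusted side, so its rank decays toward zero each iteration. That's the resilience the paper is talking about: manipulators can't vote themselves up unless the seed segment links to them.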
I think that if the company providing the search results weren't also the company getting money from SEO spammers posting ads on their sites, the search results might be better.