Recently, Google announced that it has been working on “artificial intelligence” within its own algorithm, and that this algorithm is the second most important ranking factor. While many SEO specialists have been preoccupied with how that mechanism works, I asked myself: “Okay, but what is the first ranking factor?”
In my opinion, based on six years of experience in SEO, the first ranking factor is TrustRank: a set of website characteristics that Google uses to determine its level of confidence in a website. I mean here the TrustRank known only to Google, as opposed to its imitators, such as the TrustRank metric used by the Majestic search engine.
Let’s say that TrustRank is like a point in a multidimensional data space, where the number of dimensions equals the number of factors the algorithm uses to determine TrustRank. Assuming such factors exist and each of them influences TrustRank to a different degree, the value of TrustRank could be expressed as:
TR = f(a1·x1, a2·x2, a3·x3, a4·x4, a5·x5, …, az·xz)
Where:
- TR – the value of the Trust/Relevance rank
- xn – factor number “n”
- an – the share (weight) of factor xn in the final TR value
- each an is a finite natural number
Factors “x” can be, for example: the TR of linking domains; the TR of the web page on which a backlink is placed; the quality of the server, reflected, for instance, in a unique IP address or its neighborhood; and so on.
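To make this idea concrete, here is a minimal sketch in Python of such a weighted model. The factor names, values and weights below are pure assumptions invented for the example; nothing here is anything Google has confirmed.

```python
# A purely illustrative sketch of the hypothetical TrustRank model above:
# TR as a weighted combination of factors. Factor names, values and weights
# are invented assumptions, not real Google signals.

def trust_rank(factors: dict, weights: dict) -> float:
    """Combine factor values x_n with natural-number weights a_n."""
    total_weight = sum(weights.values())
    return sum(weights[name] * value for name, value in factors.items()) / total_weight

# Hypothetical factor values, normalised to the 0..1 range.
factors = {
    "tr_of_linking_domains": 0.62,  # TR of the domains linking to us
    "tr_of_linking_pages": 0.48,    # TR of the pages carrying the backlinks
    "server_quality": 0.80,         # e.g. unique IP, clean neighborhood
}

# Hypothetical integer weights a_n (the "share" of each factor).
weights = {
    "tr_of_linking_domains": 5,
    "tr_of_linking_pages": 3,
    "server_quality": 1,
}

print(trust_rank(factors, weights))  # a single TR-like score
```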
I’ve always had the impression that changes of position in the search ranking, measured across a large number of phrases, happen in leaps. My impression was confirmed by SEO specialists I know, who, when discussing the issue (though rather generally), used terms such as “increases”, “drops”, and “stagnation”.
And this is not about a change of one or two places for a given phrase in that “hierarchy”, but about something more significant: a global change in the position of our website within it.
Although this article is something of a guessing game, or even a fantasy, I would like to share my hypothesis with a wider audience. I treat the following digressions as an elaboration of the sentence: “If I were Google, this is what I would do.”
In the rest of the article, whenever I use the phrase “a position change”, I mean changes of a global nature for the website.

What are the relationships between these factors in a multidimensional space? One does not need to be an expert to say that they are certainly not linear. In fact, they can be seen as forming a so-called “percolation threshold” (I borrowed this physical term from graph theory, where it marks the boundary between two different types of structure). Here it means a distinguishing border area: the point beyond which the change of position in a search engine takes place. Since presenting a multidimensional algorithm geometrically would be difficult, let’s focus on two dimensions.
Let’s assume that we take into consideration only two factors:
- x1 – the quality of the backlinks pointing to the website
- x2 – the number of backlinks pointing to the website
- PP – the set of percolation points, as a function of x1 and x2, at which the position change occurs
- W – the field of values at which a web page gets a higher position
- N – the field of values at which a web page gets a lower position
- L1, L2, L3, L4 – websites with specific values of x1 and x2, all other factors held constant
- where the x1 values satisfy x1(L3) < x1(L2) < x1(L4) < x1(L1)
- where the x2 values satisfy x2(L2) < x2(L1) < x2(L4) < x2(L3)
This coordinate system shows that to get a higher ranking on Google, a website has to move from field N (where L2 is located) to field W; that is, it must achieve links of such quality (x1) and in such quantity (x2) that its value point falls within W (point L4). This leads to interesting conclusions. A website can have an adequate number of links and still see no increase in position, because their quality is insufficient (as with L3). Conversely, having high-quality backlinks but too few of them, reaching a higher level is not possible either (as with L1).
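Below is a small sketch of what such a threshold could look like for just these two factors. The threshold function itself is my own invention for illustration; the only property that matters is that it demands both enough quality (x1) and enough quantity (x2), which reproduces the behaviour of L1, L2, L3 and L4 described above.

```python
# Illustrative sketch of the two-factor percolation threshold described above.
# The threshold function is an invented assumption: any curve that requires
# BOTH enough quality (x1) and enough quantity (x2) would serve the argument.

def above_threshold(x1: float, x2: float,
                    min_quality: float = 0.4,
                    min_quantity: float = 0.3,
                    level: float = 0.5) -> bool:
    """Return True if the point (x1, x2) falls in field W (position increase)."""
    if x1 < min_quality or x2 < min_quantity:
        return False          # hard minimums: quality alone or quantity alone is not enough
    return x1 * x2 > level    # a simple multiplicative trade-off beyond the minimums

# Points chosen to match the ordering given above:
# quality:  x1(L3) < x1(L2) < x1(L4) < x1(L1)
# quantity: x2(L2) < x2(L1) < x2(L4) < x2(L3)
sites = {
    "L1": (0.95, 0.35),  # high quality, too few links   -> field N
    "L2": (0.45, 0.20),  # mediocre on both axes         -> field N
    "L3": (0.30, 0.90),  # many links, quality too low   -> field N
    "L4": (0.80, 0.75),  # both sufficient               -> field W
}

for name, (x1, x2) in sites.items():
    print(name, "W" if above_threshold(x1, x2) else "N")
```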
If I were Google and the chart above were true, it could explain the following phenomena:
- adding a further variable (e.g. x3) changes the positions of points L1, L2, L3, L4 in the system and the shape of the percolation threshold
- a change in the slope of the percolation threshold reflects a change in the impact of individual factors on positions in the search results
- the ratio of x1 to x2 required for a change of position does not develop linearly
- a decline in the quality of the linking domains we maintain may cause our position to drop, even while the number of backlinks grows (although not proportionally)
- a decrease in the number of backlinks may cause our position to drop, even while the quality of the remaining ones grows (although not proportionally)
- there is a threshold value of backlink quality below which it is not possible to achieve higher positions
- there is a quantity of backlinks below which a higher position cannot be reached
Google’s level of trust in a website is reflected in ranking positions, which can change abruptly.
The shape of the percolation threshold is not arbitrary (physics has models for it), and what is much more interesting is how Google might maneuver the criteria by which a page is matched to its algorithm so that positions change in discrete steps.
It is no secret that in recent years Google has significantly raised the quality criteria that links pointing to our website must meet before they count towards an increase in position. For many SEO specialists this was a novelty. Thinking in “old patterns”, many of them, after losing their positions, kept building more and more links without worrying about their quality.
If I were Google and used the percolation curve, I would not allow a position to rise easily on the strength of new links without adequate quality.
It would be enough to shift the percolation curve to the right angle and slope so that, the next time around, pages whose links do not provide the required quality are pushed out.
Suppose we consider a case in which we again take into account only the variables x1 (quality of backlinks) and x2 (number of backlinks), which in the right proportions determine a website’s status. Over time, Google decides to emphasize an algorithm that weighs these variables in a slightly different ratio before granting a change of position, the effect of which is a shift from PP1 to PP2. This situation is illustrated in the chart below:
In the chart, the functions S1 and S2 denote how the quantity and quality of backlinks to a website change over time. As you can see, page S1 coped with the changes in the algorithm because it consistently pursued its “link building” policy. Page S2, despite a good starting point and a position similar to S1’s, did not cope with the new orientation of the percolation curve and did not achieve a change of position, even though it kept pursuing its “link building” strategy, built as it was on a bad assumption.
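A short sketch of the idea behind this chart: the same two strategies evaluated against an old threshold and a new, quality-heavier one. The linear threshold and the weights below are invented assumptions; the point is only that raising the weight of quality leaves S1 above the curve and drops S2 below it.

```python
# Illustrative sketch of the shift from PP1 to PP2 described above: the same
# link-building trajectory can stay above the new, quality-heavier threshold
# (S1) or fall below it (S2). All numbers are invented for the example.

def crosses_threshold(x1: float, x2: float, quality_weight: float) -> bool:
    """A simple linear threshold: the higher quality_weight, the more x1 matters."""
    return quality_weight * x1 + (1.0 - quality_weight) * x2 > 0.6

# Hypothetical end states of the two link-building strategies:
s1 = (0.70, 0.60)  # S1 keeps raising quality along with quantity
s2 = (0.35, 0.95)  # S2 keeps adding links of mediocre quality

for weight, label in [(0.4, "PP1 (old algorithm)"), (0.7, "PP2 (quality-heavier)")]:
    print(label,
          "S1:", "W" if crosses_threshold(*s1, weight) else "N",
          "S2:", "W" if crosses_threshold(*s2, weight) else "N")
```

Under the old weighting both pages cross the threshold; after the shift only S1 does, which is exactly the divergence the chart is meant to show.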
As a consequence of the above, there are many possible outcomes, and even more possibilities for geometric transformations. If I were Google, I would manage my own algorithm by moving it around in this multidimensional space. I wonder what Google’s response would be, and whether the employees responsible for changes in the algorithm share this opinion.

I’m really curious what you think about that.
Thanks Pawel, it’s interesting to read your perspective on this topic!
Nice, great blog, it gives lots of knowledge.
Good explanation, and a good theory, but not very helpful in the real world. It is obvious, and well known, that Google in some way counts backlink quality as a function of quantity, but in what proportions? And how does one cheaply analyze and develop quality backlinks? In many cases it is not worth the effort; it is simpler to buy keywords, banners, displays, Facebook ads, etc.
Don’t get me wrong; I applaud your efforts, but I’m not sure how they are of value to the working SEO manager.
Yes, Larry. You are totally right. It’s just a theory. But if we knew the actual numbers, it would be of more value for SEO practitioners 😉
Well written, Pawel. I appreciate your knowledge. A lot of SEO professionals fail to understand that if a website has a good Trust Rank/score, it tends to rank high in the SERPs for all its targeted keywords, even if there are no backlinks to those individual pages.
My 7-month-old blog has a good Trust Rank and lots of inbound links. My individual blog posts get indexed to the first page within seconds of publishing, and the blog is making hundreds of dollars every month from AdSense.
I believe the solution is to get good backlinks up to the level where Google trusts your site; once you reach the point where your blog posts begin to rank on the first page easily, you start getting editorial backlinks from other bloggers.
I basically start with grey-hat techniques like blog commenting and move to white-hat natural links once my site starts to rank. I hope this helps. Thanks, Pawel.
You wrote it well, and one should certainly take care of a site’s authority and trust. However, this indicator is difficult to observe. Do you think it coincides with the popularity of the site according to Alexa?