The concept of measuring a site’s force was first made popular by Google. In simple terms, each link to a web page is a vote for that page. But it’s not as simple as “the page with the most votes wins.” The original PageRank algorithm (Sergey Brin and Larry Page) formalised the idea of using links as a ranking factor. How does it work? Let’s see.
How Do Links Influence Search Engine Rankings?
Since “PageRank” is a term Google no longer supports publicly, I’m going to talk about the “Force” of each website instead, assuming (as a guess) that it follows the same rules as PageRank.
First of all, the Force a page can pass on to other pages through links is less than the page’s own Force. If a page’s Force is “F”, the force actually passed by its links is represented by f(F). In 2009 Google said that a page might be able to vote with 85-90% of its PageRank (Force). I think it is less these days, that it fluctuates, and that it is manipulated by Google more than before. I guess its value might soon fall to a minimum, at least in some industries.
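As a minimal sketch of the f(F) idea above (the 85-90% figure is the only number given in the text; the function name and the exact value used here are my assumptions):

```python
# Sketch: the Force a page can pass on is a damped fraction of its own Force.
# DAMPING is an assumed value inside the 85-90% range mentioned above.
DAMPING = 0.85

def passable_force(F, damping=DAMPING):
    """f(F): the portion of a page's Force F that its links can pass on."""
    return damping * F

print(passable_force(1.0))  # a page with Force 1.0 can pass on about 0.85
```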

Every page is given an innate but tiny amount of Force to pass on. Pages may, of course, link to more than one other page. When that happens, the passable Force gets divided among all the pages receiving links. The share is not equal and depends on where each link is located on the page.
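A rough sketch of that division, with hypothetical positional weights (how Google actually weights link position is not public):

```python
# Sketch: the passable Force is divided among the outbound links.
# The weights are invented for illustration; they only capture the idea
# that a link's position on the page changes the share it receives.
def split_force(passable, link_weights):
    """Divide the passable Force among link targets according to their weights."""
    total = sum(link_weights.values())
    return {target: passable * weight / total for target, weight in link_weights.items()}

# A page with passable Force 0.85 linking to B from the main content
# and to C from the footer (the weights are made up).
print(split_force(0.85, {"B": 0.7, "C": 0.3}))  # roughly {'B': 0.595, 'C': 0.255}
```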
Very often, however, pages cross-link, which makes the force calculation much more complex. For example, when Page B links back to Page A to make the link reciprocal, the Force of Page A becomes dependent on the Force of Page B. In addition, the force that Page A passes to Page C is also affected by the link from Page B to Page A. If we add a link from Page B to another page, Page D, we get a very complicated situation in which the Force of each page on the Web must be determined by recursive analysis.
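A minimal sketch of that recursive analysis, assuming a PageRank-style iterative model (the damping factor, the even split per link, and the starting values are my assumptions, not anything stated above):

```python
# Sketch: iterative ("recursive") calculation of Force for cross-linked pages.
# Each page splits its damped Force evenly among its outbound links, and the
# calculation is repeated until the values settle. Pages without outbound
# links simply pass nothing on in this simplified version.
def compute_force(links, damping=0.85, iterations=50):
    pages = set(links) | {p for targets in links.values() for p in targets}
    force = {page: 1.0 / len(pages) for page in pages}              # equal starting Force
    for _ in range(iterations):
        new = {page: (1 - damping) / len(pages) for page in pages}  # innate, tiny amount
        for page, targets in links.items():
            for target in targets:
                new[target] += damping * force[page] / len(targets)
        force = new
    return force

# The cross-linked example discussed below: A <-> B, plus A -> C and B -> D.
print(compute_force({"A": ["B", "C"], "B": ["A", "D"], "C": [], "D": []}))
```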
Here is a situation where the pages are cross-linking:

Page A has f(A), Page B has f(B), Page C has f(C) and Page D has f(D).
The link force from page A splits between pages B and C. It can be expressed as follows:
A passes force to page B as 0.5 x f(A), and it passes force in the same proportion to page C – 0.5 x f(A).
On the other hand, page B links back to page A, giving it part of the force it had received from page A (!). This has two consequences: page B loses the part of its force given away in the link, and the force of page A grows. The sizes of those two contributions are different and, it seems, known only to Google.
We can assume that, in mutual backlinks, the larger share of the force increase goes to the page that gave its force first (or whose link Google indexed first).
If we call the contribution X and the number of outbound links N, we have a situation where N is 2 for page A (N(A)=2) as well as for page B (N(B)=2), while for pages C and D it is 0 (N(C)=0, N(D)=0). But how is this contribution calculated?
Imagine that the link force contribution described above is X(A) for page A, X(B) for page B, X(C) for page C and X(D) for page D.
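To make the notation concrete, here is a tiny sketch that only computes the N(…) counts and a placeholder X(…) value; treating X as the Force divided equally by N is purely my assumption, since the real formula is not public:

```python
# Sketch: map the notation to code. N(p) is the number of outbound links of
# page p; X(p) is a placeholder per-link contribution, assumed here to be the
# page's Force divided by N(p).
force = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}                # placeholder f() values
outlinks = {"A": ["B", "C"], "B": ["A", "D"], "C": [], "D": []}

N = {page: len(targets) for page, targets in outlinks.items()}
X = {page: (force[page] / N[page] if N[page] else 0.0) for page in outlinks}
print(N)  # {'A': 2, 'B': 2, 'C': 0, 'D': 0}
print(X)  # {'A': 0.5, 'B': 0.5, 'C': 0.0, 'D': 0.0}
```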
Its exact value is known only to Google, but let’s be precise about why indexing time matters so much. If Googlebot started crawling the Internet from page A, over the following steps it would record and deliver to the Google Index this data:
- Page A links to page B, passing the Force f(A) (at this step N(A) = 1),
- Page B links to page A, passing its Force f(B) (at this step N(B) = 1),
- Page A links to page C, passing the Force 0.5 x f(A) (imagine the bot had to pick one path and this time chose page C instead of page B),
- The Google Index now records that page A links to two pages (N(A)=2), so the Force that page A delivers to page B automatically drops to 0.5 x f(A).
Search Results
Let’s stop here and look: it is not certain that page A, linking to pages B and C, will pass them its force within a short time. The likely disproportion between the force-transfer contributions X(A) and X(B) depends on how quickly, and in what volume, the search engine bots move and download information.

This way of thinking explains the situation in which new websites rank high in the search results and then drop without any apparent reason (as further bot visits reduce the impact of their links).
- On page C the bot will stop, because it does not link anywhere,
- The bot moves again to page B, which passes its Force to page D as 0.5 x f(B) (now N(B)=2), and so the Force passed to page A decreases to 0.5 x f(B), reducing its value in the Google Index (a sketch of this re-division follows the list),
- On page D the bot will stop, because it does not link anywhere.
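A small sketch of that walk, assuming the per-link force is simply the page’s Force divided by the number of its outbound links discovered so far (this re-division rule is my assumption):

```python
# Sketch: as the bot discovers more outbound links on a page, the share
# recorded for each of that page's links shrinks, because the passable Force
# is re-divided among all the links known so far.
def crawl_steps(discovered_links):
    known = {}                                    # page -> targets discovered so far
    for page, target in discovered_links:
        known.setdefault(page, []).append(target)
        share = 1.0 / len(known[page])
        print(f"{page} -> {target}: N({page}) = {len(known[page])}, "
              f"each link from {page} now passes {share} x f({page})")

# The walk described above: A -> B, B -> A, A -> C, B -> D.
crawl_steps([("A", "B"), ("B", "A"), ("A", "C"), ("B", "D")])
```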
This particular situation seems absurd.
It means that, as bot visits accumulate, the power of the passed link Force keeps falling towards… 0 (!), so Google was forced to set up a defence against this limit. What would it be?
- A limit on the Force passed between two pages in the Google Index (here, pages A and B),
- A limit on the Force passed through mutual links (A links to B, B links to A); a rough sketch follows below.
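Purely as an illustration of what such a defence could look like (neither the mechanism nor the numbers below are documented anywhere; this is a guess written as code):

```python
# Sketch: capping the Force exchanged over a reciprocal (A <-> B) pair of links.
# RECIPROCAL_CAP is a made-up constant; the only point is that mutual links
# cannot keep pumping Force back and forth without limit.
RECIPROCAL_CAP = 0.25

def force_passed_over_link(source_force, n_outlinks, reciprocal=False, damping=0.85):
    value = damping * source_force / n_outlinks
    if reciprocal:
        value = min(value, RECIPROCAL_CAP * source_force)
    return value

print(force_passed_over_link(1.0, 2))                   # ordinary link: 0.425
print(force_passed_over_link(1.0, 2, reciprocal=True))  # capped reciprocal link: 0.25
```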
And who is the winner?
It looks like pages C and D are the lucky ones, although Google was again forced to correct the link Force. It is officially known that:
- Websites that link nowhere cannot hold the top positions – it is not possible to learn fully from them, because all they offer is their own, always limited, content,
- This situation would also cause a paradox – if most websites did not link to each other, where would they gain their own links from? (The sketch below shows how the classic PageRank model treats such dead-end pages.)
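For comparison, the published PageRank formulation already deals with such dead-end pages: the rank of a page with no outbound links is redistributed across all pages instead of being lost. A minimal sketch of that standard treatment, reusing the A-D example:

```python
# Sketch: the classic PageRank handling of "dangling" pages (no outbound links).
# Their rank is redistributed uniformly, so linking nowhere neither drains the
# calculation nor breaks it.
def pagerank_with_dangling(links, damping=0.85, iterations=50):
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        dangling = sum(rank[page] for page in pages if not links.get(page))
        new = {page: (1 - damping) / n + damping * dangling / n for page in pages}
        for page, targets in links.items():
            for target in targets:
                new[target] += damping * rank[page] / len(targets)
        rank = new
    return rank

print(pagerank_with_dangling({"A": ["B", "C"], "B": ["A", "D"], "C": [], "D": []}))
```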

Changes
In its algorithms, Google should therefore account for some relevant variables:
- The size of the X contributions, depending on the direction and number of links for a particular page (e.g. page A),
- Rules for link indexing time, to avoid situations in which bots would have to be everywhere on the Internet at the same time,
- A limit on the force contribution in cross-linking.
It is easy to understand the popularity of Black Hat SEO, especially when we consider the complicated structure of linking between websites. The real-time rollout of the Penguin algorithm is a huge revolution, but it could be pernicious for the Internet as we know it. I suspect there are more limits on the flow of force (and more complicated ones) than the simple description of force passing through links given here.
Inspired by “The Art of SEO”, 3rd Edition, E. Enge, S. Spencer, J. C. Stricchiola, 2015.