"I talked to a lot of SEO specialists from big enterprises about their business and their answers differed.
These companies have different opinions on the reason why they reject links.
But with more websites implementing push notification feature, Googlebot developers are on the point of adding support for HTTP in future.” It should be recalled that in April 2016, John Mueller said that the use of the HTTP / 2 protocol on the website does not directly affect the ranking in Google, but it improves the experience of users due to faster loading speed of the pages.
Therefore, if you have a change, it is recommended to move to this protocol.
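Before migrating, it is easy to check whether a server already negotiates HTTP/2. Below is a minimal sketch using the third-party Python httpx library (the URL is a placeholder, and httpx is simply one convenient tool for this check, not something Google prescribes):

    # Requires the third-party httpx package: pip install "httpx[http2]"
    import httpx

    def negotiated_http_version(url: str) -> str:
        # http2=True lets the client offer HTTP/2 during the TLS handshake;
        # the server then picks the best version it supports.
        with httpx.Client(http2=True) as client:
            response = client.get(url)
            return response.http_version  # e.g. "HTTP/2" or "HTTP/1.1"

    print(negotiated_http_version("https://example.com"))  # placeholder URL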
"I talked to a lot of SEO specialists from big enterprises about their business, and their answers differed. These companies have different opinions on why they reject links." One site owner put it this way: "I have had it for 4 years already, and I do not have a file named Disavow." Thus, in the case where a website owner previously engaged in buying links or used other prohibited methods of link building, an audit of the link profile and the disavowal of unnatural links are necessary to avoid future manual sanctions. For many website owners, however, disavow files are not necessary, and it is better to spend this time on improving the website itself, says Slagg.
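For those who do maintain one, the disavow file is a plain text file uploaded through the Disavow Links tool in Google Search Console. A minimal sketch of the documented format, with placeholder domains and URLs:

    # Lines starting with "#" are comments and are ignored
    # Disavow every link from an entire domain:
    domain:spammydomain.example
    # Disavow a single page that links to your site:
    http://spam.example.com/page-with-bad-link.html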
Oct 08/2017: During the latest video conference with webmasters, Google employee John Mueller stated that the search team does not check all spam reports manually. The question to Mueller was the following: "Some time ago we sent a spam report, but still have not seen any changes." The answer was: "No, we do not check all spam reports manually." Later Mueller added: "We are trying to determine which spam reports have the greatest impact; it is on those that we focus our attention, and it is those that the anti-spam team checks manually, processes and, if necessary, applies manual sanctions to. Most of the other reports that come to us are just information that we collect and can use to improve our algorithms in the future." As for report processing time, it can take considerable time. As Mueller explained, taking measures may take "some time", not just a day or two.