

Thread: Don't Hoard Your Google PageRank

  1. #11
    WebProWorld MVP ronniethedodger's Avatar
    Join Date
    Aug 2003
    Posts
    1,400
    Quote Originally Posted by Dave Hawley
    I think not! Just take a look at Microsoft.com, Apple.com and many other >PR8 sites out there. In fact, it would appear the complete opposite is true.
    What gives you the impression that these sites are "dead ended" or "PageRank sinks," as they are being called?

    To heck with Apple, I have my own opinion about that site (ask Mel). But Microsoft has thousands of outbound links, and using them as an example of a dead-ended, PageRank-sink site is just not true. They link out to hundreds of third-party software companies that offer a wide range of add-ons and plug-ins supporting other Microsoft products, and their technical papers often cite and link to hardware manufacturers.

    Come back with another example, those two you picked just don't cut the mustard in that area. ;0)
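    For readers unfamiliar with the term: a "PageRank sink" (or "dead end") is a page or site with no outbound links, so the rank flowing into it never flows back out. Below is a minimal sketch of the classic PageRank iteration on a hypothetical four-page web; the graph and numbers are invented, and whether Google actually penalizes such pages is exactly what this thread is arguing about.

    Code:
    // Classic PageRank on a toy graph. Page D is the "sink": no outbound links.
    // In the usual formulation, a dangling page's rank is redistributed evenly
    // across every page, so hoarding buys D nothing in particular.
    const pages = ["A", "B", "C", "D"];
    const links = { A: ["B", "C"], B: ["C", "D"], C: ["A", "D"], D: [] };

    const d = 0.85; // damping factor from the original PageRank paper
    const N = pages.length;
    let rank = Object.fromEntries(pages.map(p => [p, 1 / N]));

    for (let iter = 0; iter < 50; iter++) {
      const next = Object.fromEntries(pages.map(p => [p, (1 - d) / N]));
      for (const p of pages) {
        const out = links[p];
        if (out.length === 0) {
          // Dangling page: its rank leaks evenly to the whole web.
          for (const q of pages) next[q] += (d * rank[p]) / N;
        } else {
          for (const q of out) next[q] += (d * rank[p]) / out.length;
        }
      }
      rank = next;
    }
    console.log(rank); // D soaks up rank from B and C but directs none of it anywhere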

  2. #12
    WebProWorld MVP williamc's Avatar
    Join Date
    Jul 2003
    Location
    On a really big hill in Kentucky
    Posts
    4,538
    The only fault I had with the article is that Garrett was stating things as "fact", as if they were here and now. They are not, either from M$ or Google.

    Matt Cutts does not need to tell me that Google is trying to make gbot understand JS. I already know that just from seeing googlebot-test in my logs. And for your information, there has been some evidence that googlebot does indeed see some JavaScript links already. Does it penalize for them? No. Why? Because Google likes to handle things via its own algo. That means it will teach gbot to read the links and follow them. If you research, test, and know the facts, it is easy to see this. If you do not, and rely on message board postings, then it must be a rose-colored world to be ignorant.
    William Cross
    Web Development by Those Damn Coders
    Firearm Friendly Websites because our constitution matters

  3. #13
    WebProWorld MVP
    Join Date
    Jul 2003
    Posts
    1,931
    Quote Originally Posted by ronniethedodger
    Quote Originally Posted by Mel
    It's all well and good for Dan to suggest what Google's likes and dislikes are, but that does not make them facts, and they should
    For one, he did not report them as facts but rather as the opinions of others. He used multiple sources to arrive at an opinion of his own.
    Please reread, Ronnie. I am talking about DAN.

    Nowhere in there did I see Garrett suggest or imply that it was actual fact. I read his post as an opinion of his, and he has a right, just like you or I, to offer those opinions by posting them to this forum.
    Quote Originally Posted by Garrett

    If you're using this tactic on your site you're liable to be penalized...

    The other reason you should continue linking is that Google also penalizes "dead end" sites or "PageRank Sinks,"
    That seems pretty clear-cut to me: not an opinion, but stated as a fact by the Admin.

    Quote Originally Posted by Mel
    This forum should hold to better standards than this.
    Yes they should, and they do. I don't know what got your panties in a bunch....
    And if you want more opinion....
    This is no different than Google ferreting out white text on a white background or background-colored links. The JavaScript links are another form of spoofing the spider into thinking something is what it really is not.
    What got my panties in a bunch (as you so elegantly put it, Ron) is people conveying as facts ideas which are at best opinions, and not offering any substantiation at all to back them up.

    If you think that better standards should not be upheld and that half-baked opinions should be offered as facts with no one challenging them, that's fine, but I would like to see better. YMMV


    You can choose to ignore it and just pass it off with the standard "unsubstantiated theory" line of thinking that I keep hearing from a lot of folk, but if you do so on this one... then you are not as smart as I once gave you credit for.
    Unless and until someone can show at least one tiny example of this in action, it is still an "unsubstantiated opinion" (IMO, of course), Ron, and should not be reported as fact. This has always been my position and will continue to be.

  4. #14
    Senior Member
    Join Date
    Jul 2003
    Posts
    978
    Quote Originally Posted by williamc
    The only fault I had with the article is that Garrett was stating things as "fact", as if they were here and now. They are not, either from M$ or Google.

    Matt Cutts does not need to tell me that Google is trying to make gbot understand JS. I already know that just from seeing googlebot-test in my logs. And for your information, there has been some evidence that googlebot does indeed see some JavaScript links already. Does it penalize for them? No. Why? Because Google likes to handle things via its own algo. That means it will teach gbot to read the links and follow them. If you research, test, and know the facts, it is easy to see this. If you do not, and rely on message board postings, then it must be a rose-colored world to be ignorant.
    Sure thing... another truth... Google is interested only in making its archive the best, which is different from manipulating a site to appear the best.

    The inference therefore is that "sinkhole practices" will eventually have a similar effect to duplicated content, same-color text/background, and hidden links...
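    (For anyone unfamiliar with the older tricks just listed, here is a hypothetical period-style snippet - the keywords and example.com are invented - showing white-on-white text and a 1x1-pixel "hidden link", both of which Google's guidelines already prohibited.)

    Code:
    <!-- White text on a white background: invisible to visitors, visible to the bot -->
    <body bgcolor="#ffffff">
      <font color="#ffffff">widgets cheap widgets discount widgets</font>

      <!-- A hidden link: a transparent 1x1 image, so visitors never see it,
           but a crawler still follows the href -->
      <a href="http://www.example.com/"><img src="spacer.gif" width="1" height="1" border="0" alt=""></a>
    </body>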

    Google frowns on these practices - right or wrong - so then ask why.

    On the other hand, many sites are designed entirely in JavaScript, completely oblivious to today's SE limitations (thus additional incentive for Google to perfect JavaScript readability). But who is fooling whom here... do you honestly believe that Google isn't interested in spam reduction, or that it appreciates that we "manipulate" its creation to benefit ourselves (and clients) while Google benefits itself? It's quite arrogant to believe that Google doesn't care - and therefore won't stem the tide - since they have repeatedly shown in the past that manipulation without substance is bad for their business.

    Like many, I ride the edge... but I am not naive enough to believe that the edge of today will be the same as the edge of tomorrow, and therefore I keep on top of what is changing to be certain I don't end up in shit creek... I really don't believe you would rather be in the creek first before seeking alternatives... but I could be wrong.
    New daily advice on Advance SEO, Copyright & DMCA @ Twitter

  5. #15
    WebProWorld MVP williamc's Avatar
    Join Date
    Jul 2003
    Location
    On a really big hill in Kentucky
    Posts
    4,538
    Fathom, I actually quite agree with most everything you said. But on the other hand, Google has always shown that it prefers to deal with most things in its algo. Knowing the way they have dealt with similar things leads me to believe that they are teaching their bot to read JavaScript not to punish, but to make their listings as relevant as they can be. I do not see them penalizing JavaScript links - when a great many large sites use them routinely for very valid reasons - just to stop a few people from using them to hoard PR.

    Someone above mentioned that the JavaScript links were used to redirect, etc., etc. I am still wondering where this came from, as we were talking about JavaScript *links* and not JavaScript redirects or shadow games.
    But then, not everyone can read that well, so I'll let that one slide.
    William Cross
    Web Development by Those Damn Coders
    Firearm Friendly Websites because our constitution matters

  6. #16
    WebProWorld MVP ronniethedodger's Avatar
    Join Date
    Aug 2003
    Posts
    1,400

    Re: Don't Hoard Your Google PageRank

    For the record, for people who cannot read the King's English.

    Quote Originally Posted by Garrett
    If you've been using JavaScripts on outbound links from your webpages in an effort to preserve your PageRank you should quit - for two reasons.
    The word should implies that it is a suggestion and something that is not written in stone. He is specifically talking about JavaScripted links that are designed for one purpose... and that is deception. If you are not using them for this purpose, then ignore the rest of my reply.

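    To make concrete what is being argued over, here is a hypothetical illustration (example.com stands in for any outbound target; none of this is from Garrett's article). A crawler of the day that does not execute JavaScript finds nothing to follow in the scripted versions, so no PageRank flows out - yet a human visitor sees an ordinary link either way.

    Code:
    <!-- A normal outbound link: the spider can follow it, and PageRank flows out -->
    <a href="http://www.example.com/">Partner site</a>

    <!-- A "JavaScripted link": no real href for a non-JS crawler to follow -->
    <a href="#" onclick="window.location='http://www.example.com/'; return false;">Partner site</a>

    <!-- A variant: the link is only written into the page at runtime -->
    <script type="text/javascript">
      document.write('<a href="http://www.example.com/">Partner site</a>');
    </script>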
    He then gave two reasons why he feels this way. Both reasons have multiple sources to support his opinion. I am calling it an opinion because his opening paragraph does not call it a fact. When you piece together bits and pieces of information, you come up with these opinions or theories -- any one of them by itself is nothing, but two or three that support a similar line of thinking and reasoning raise the possibility that it is true, or at worst a very good possibility.

    Nobody seems to dispute this part about Google wanting to parse JavaScript. It is a necessary step for Google to take if they want to survive against Microsoft (this is my opinion), who can already parse JavaScript and VBScript from an ordinary HTML page or external .js file.

    Garrett stops short of saying that they are red flags to the Google engineer... he uses the word potential. Again you guys are unjustly quoting him as stating it as fact and reporting it to be fact (get out those dictionaries, boys). The ensuing paragraphs are in support of this statement, which uses the word "potential".

    Every tactic designed to spoof the bots has resulted in Google finding a way to sniff it out. It is a never-ending game, much like the virus game is played. I can see no reason why the method of JavaScripted links will be any different to them in this regard. I don't know... call this one a gut feeling. But if you tend not to believe it, then you are severely underestimating Google.

    Again, though, Garrett is only referring to deceptive means of preserving PageRank by hiding outbound links. He does hint at other techniques that may come under scrutiny when he mentions "aggressive" SEO tactics... but let's not cloud the issue. Not unless this is somewhere you want to go with this discussion, cuz if you think I am blowing wind up your skirt on the links issue, then you don't want to get me started on the other forms of deception that go far beyond the linking issue. This, after all, is a thread about PageRank.

    Quote Originally Posted by Mel
    Unless and until someone can show at least one tiny example that shows this in action then it is still an "unsubstantiated opinion" (IMO of course) Ron and should not be reported as facts. This has always been my position and will continue to be.
    What is this, Mel? Do you have that stock line in a cut-and-paste buffer or something? "Unless someone can show me...."

    It was an opinion -- plain and simple. With Google getting into stemming and word semantics, and soon being able to actually read the content on the written page -- I think you had better take an English course and catch up on reading it yourself so you will be able to write it too.

    What you are saying is that you can smell the smoke. But until the fire comes and lights your ass up, you are not going to do anything about it. You would rather all of us sit here and smell the smoke with you. Sorry Mel, I am getting into another vehicle -- one that isn't stalled on the railroad track with the signal flashing and bells ringing.

    You would also just like to stay even with the game, and not one step ahead. No forethought, no planning for possible scenarios in case you get zinged by something right out of the blue. This is not my line of thinking, Mel; it often appears to be yours, though.

    Garrett's opinion offers another observation in the grander scheme of things, and it goes beyond this trivial little spat we are having about JavaScripted linking. It clearly shows that the SE game is changing technologically. I can see this and feel it like I am a part of it. I have said it before and I will say it again: with Microsoft stepping into the ring, it will explode into something that you cannot even begin to fathom.

    And speaking of Fathom, his mention of "playing it safe" is another opinion that is worth heeding. This thing of JavaScripted links is a serious issue, and one not to be trifled with. The capability of reading and interpreting them is there -- and that is a fact, Mel.

  7. #17
    Senior Member
    Join Date
    Jul 2003
    Posts
    978
    Quote Originally Posted by williamc
    Google has always shown that it prefers to deal with most things in its algo. Knowing the way they have dealt with similar things leads me to believe that they are teaching their bot to read JavaScript not to punish, but to make their listings as relevant as they can be.
    Valid point - agree.

    I do not see them penalizing JavaScript links - when a great many large sites use them routinely for very valid reasons - just to stop a few people from using them to hoard PR.
    Again, a valid point, as Google does not wish to penalize without just cause. At the same time, I'll use a "valid" reason for duplicate content to illustrate.

    A nationwide service had a website that supported 15,000 cities and towns in the United States. They had thought to develop satellite sites for each center, but read that heavy crosslinking was bad and that Google had a preference for large websites (both of these assumptions are somewhat accurate and in other ways inaccurate, depending on how the information is used).

    They rationalized that each center should have its own pages of content, organized by state in their now quite large website. A fair assessment... however, as one center is precisely the same as the next, content diversity was limited mostly to city/state info.

    While they were familiar with "duplicate content" penalties (something else they read), they also knew that Google defines success mostly by "building websites for the user" and (the key statement) "as if Google and other search engines didn't exist".

    They rationalized that a person in Tampa, Florida would not visit, nor be interested in, content for New York, New York, and with Google's own guidelines in hand cleverly reasoned that a New Yorker wouldn't search for a Tampa center either (Google doesn't exist), so they set out and developed 15,000 pages.

    They did precisely what Google suggested (using expert advice, most of which was taken "out of context") and got a major penalty for their effort!

    The short of it (and back to your statement, "just to stop a few people from using them to hoard PR"): it's not a few - it's a problem... and we can rationalize this many different ways, but the only rationalization that matters is Google's.

    <added>In the context of your thoughts on JavaScript: were they attempting to beat Google at its own game (ranking-wise), or attempting to follow Google's guidelines and expert advice?</added>

    Someone above mentioned that the JavaScript links were used to redirect, etc., etc. I am still wondering where this came from, as we were talking about JavaScript *links* and not JavaScript redirects or shadow games.
    But then, not everyone can read that well, so I'll let that one slide.
    Ya - just like the example - eh! :-)

    The biggest problem is that "none" of us has the time to write a book on all the specifics of both sides, and "none" would read it verbatim if we did; plus, by the time it's published, it's dated info.

    Notwithstanding that, the "forum" is the best resource for stuff like this, as the dialogue isn't just a passive read.
    New daily advice on Advance SEO, Copyright & DMCA @ Twitter

  8. #18
    Senior Member
    Join Date
    Jul 2003
    Posts
    978
    Just thought I would add - if a website is completely built with internal JavaScript links, it is a design flaw that Google will address...

    On the other hand, if all (or most) internal links are standard and all (or most) external links are JavaScript, what is the inference here... different?

    How difficult would it be for Google to interpret this difference?

    That last part is the key... how does Google rationalize the pattern difference? (A crude sketch of the idea follows below.)
    New daily advice on Advance SEO, Copyright & DMCA @ Twitter
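    Fathom's question is worth making concrete. The sketch below is hypothetical - a crude, regex-based audit, certainly not Google's actual method - but it shows how cheaply the "internal links plain, external links scripted" asymmetry could be measured. The function name, the sample page, and mysite.com are invented for illustration.

    Code:
    // Count how a page's links split between plain anchors and JavaScript-driven
    // ones, and between internal and external targets. A strong asymmetry
    // (internal mostly plain, external mostly scripted) is itself the pattern.
    function auditLinks(html, ownDomain) {
      const stats = {
        internal: { plain: 0, scripted: 0 },
        external: { plain: 0, scripted: 0 },
      };
      for (const tag of html.match(/<a\b[^>]*>/gi) || []) {
        const hrefMatch = tag.match(/href\s*=\s*["']([^"']*)["']/i);
        const href = hrefMatch ? hrefMatch[1] : "";
        const scripted = /onclick\s*=/i.test(tag) ||
                         /^javascript:/i.test(href) ||
                         href === "#" || href === "";
        // The real destination may be hidden inside an onclick handler,
        // so look for any absolute URL anywhere in the tag.
        const urlMatch = tag.match(/https?:\/\/[^"'\s]+/i);
        const external = urlMatch !== null && !urlMatch[0].includes(ownDomain);
        stats[external ? "external" : "internal"][scripted ? "scripted" : "plain"]++;
      }
      return stats;
    }

    // Toy page: internal navigation is plain, the one outbound link is scripted.
    const sample = `
      <a href="/about.html">About us</a>
      <a href="/products.html">Products</a>
      <a href="#" onclick="window.location='http://partner.example.com/'">Partner</a>`;
    console.log(auditLinks(sample, "mysite.com"));
    // -> { internal: { plain: 2, scripted: 0 }, external: { plain: 0, scripted: 1 } }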

  9. #19
    Senior Member
    Join Date
    Jul 2003
    Posts
    978
    Garrett wrote:

    If you're using this tactic on your site you're liable to be penalized...

    The other reason you should continue linking is that Google also penalizes "dead end" sites or "PageRank Sinks,"
    A note:

    liable - At risk of or subject to experiencing or suffering something unpleasant...

    that is quite a bit different from,

    fact - Knowledge or information based on real occurrences.

    Seems Garrett is quite clever... he can make people see whatever they want to see, and get them debating "facts" that were never written... the power of written suggestion... he's very sophisticated! ;-)
    New daily advice on Advance SEO, Copyright & DMCA @ Twitter

  10. #20
    WebProWorld MVP
    Join Date
    Jul 2003
    Posts
    1,931
    Hmmm... there are none so blind as those who will not see.

    The words "subject to" do not offer alternatives; IMO it means that is what is going to happen.

    You next quote Garrett as flat-out saying that Google does penalize sinkhole sites.

    I guess we understand English in different ways, but if the objective of the forum is to educate and not mislead, then I suggest again that such unsubstantiated speculations should not be stated as if they were facts, since they are liable to be interpreted just as they are written, without requiring a hundred lines of supporting posts to interpret them.

