Mods & Plugins (Releases & Support) / Re: [MOD] Treat bots as users with less rights
« on: May 19, 2009, 06:40:51 AM »
I think I was replying to the wrong thread about this issue... so I'll move my questions here.
Quote
I'm a little confused; it seems that this mod you mention only controls the sections of the site which the robots are allowed to view... I know I can keep them out of separate *.php files through nofollow...
But if I understand correctly, session IDs cause errors when indexed by Google: it will create links with the session ID embedded, which produce 404 Not Found errors.
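To show what I mean, here's a rough Python sketch (my own illustration, not from the mod) of stripping a PHPSESSID-style parameter out of a URL — I'm assuming the parameter name here, the forum software may embed it differently:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_session_id(url, param="PHPSESSID"):
    """Remove a session-ID query parameter, leaving a clean, indexable URL."""
    parts = urlparse(url)
    # Keep every query pair except the session-ID one.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

dirty = "http://example.com/index.php?topic=123.0&PHPSESSID=abc123"
print(strip_session_id(dirty))  # http://example.com/index.php?topic=123.0
```

The "dirty" form is what a guest (or a bot treated as a guest) follows and what ends up in the index; the stripped form is what I'd want Google to see.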
You're telling me that bots, with this mod, will not have a session ID because they are members...????
But don't members have session IDs too?
If so, why won't the members' session IDs be indexed?
Or are you saying that members have no session IDs, so every URL followed will be clean, without a session ID?
If so, why go through the trouble of adding a bot/not-a-bot check to every single link on your website, instead of just blocking the unwanted pages with robots.txt and simply requiring the robot to be a member?
However the internal member management works, let it allow the bot to do everything. As I understand it, robots.txt is a separate form of robot management, which will override the allow-all of the internal member management and restrict the robot to your nofollow commands.
In the end, the benefit of the mod is that no user session ID gets indexed.
All assuming members don't use session IDs.
That's the big question: how can members not use session IDs while guests do? I'm guessing the session ID controls things like when the user should log in and what they can see... which is stored in a cookie, I guess? So with no cookie, members would either be allowed to stay logged in forever, somehow tracked by their current IP, or the opposite: they would be asked to log in with every single click, because the system can't determine who they are with no cookie to register the user info in.
If it is recorded by IP, why use cookies at all? Because two different users might use the same IP? Well, if there were no cookie to tell the difference and it went by IP address, there would be nothing keeping one user safe from the other unless he specifically logged out, and only then would it record the IP as logged out?
Why not simply skip the cookie process and make mandatory 30-minute logouts for each user?
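To illustrate why I think the cookie matters here, a rough sketch (purely my own example, not the forum's actual code) of keying sessions by a per-browser cookie token versus by client IP:

```python
import secrets

cookie_sessions = {}   # cookie token -> username
ip_sessions = {}       # client IP   -> username

def login_with_cookie(username):
    # Each browser gets its own random token, stored in its cookie.
    token = secrets.token_hex(8)
    cookie_sessions[token] = username
    return token

def login_with_ip(username, ip):
    # Only one session can exist per IP: the last login wins.
    ip_sessions[ip] = username

# Two users behind the same NAT/proxy IP:
t_alice = login_with_cookie("alice")
t_bob = login_with_cookie("bob")
login_with_ip("alice", "10.0.0.5")
login_with_ip("bob", "10.0.0.5")

print(cookie_sessions[t_alice])  # alice - tokens keep the two apart
print(ip_sessions["10.0.0.5"])   # bob - alice's session was overwritten
```

With IP keying, Bob's login silently replaces Alice's, which is exactly the "nothing keeping one user safe from the other" problem I'm worried about.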
I know this is confusing... if you could respond to each section inline, it would be easier to follow.
I just don't understand the logic of that mod, going through every single link saying what robots can and can't do, instead of doing it much more simply through robots.txt.
If the robot is allowed to be listed as a user, wouldn't robots.txt act as a way for the robot to govern itself by following the robots.txt map, rather than the inefficient way of adding markup to each and every link?
I could understand using the if-robot tag to SHOW information to the robots that no one else can see... but why use it instead of robots.txt to hide things?
All I want to do is keep the session ID out of the index...
Is there a way to kill it and allow the robot to view everything but govern itself through robots.txt, allowing a golden map of session-free indexing?
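Something like this is what I have in mind for robots.txt (the paths are just placeholders for whatever a forum would want hidden, not anything specific to this mod):

```text
User-agent: *
Disallow: /index.php?action=login
Disallow: /index.php?action=profile
```

The bot stays a full member internally, and this one file does all the restricting, instead of markup on every link.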
Thanks... sorry this is so confusing.