[BlindTlk] further explanations.
Lloyd Rasmussen
lras at sprynet.com
Wed Jan 9 12:11:47 UTC 2019
Many websites have a file called robots.txt that designates URLs they
don't want search engines to index. If this is what you are running into, I
think you would need to contact the operators of those sites about the
information you want to find.
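For what it's worth, a robots.txt file is just a plain-text list of rules, and you can check what it allows from a script. Here is a rough sketch using Python's standard urllib.robotparser module; the site address is only a placeholder, not one of the sites in question:

    import urllib.robotparser

    # Point the parser at the site's robots.txt (placeholder URL).
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # can_fetch() reports whether a given crawler may index a URL.
    allowed = parser.can_fetch("*", "https://www.example.com/some/page.html")
    print("Allowed for crawlers:", allowed)

Note that robots.txt only covers automated crawlers; it doesn't by itself produce an error message when you browse a page yourself.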
Lloyd Rasmussen, Kensington, MD
http://lras.home.sprynet.com
-----Original Message-----
From: Justin Williams via BlindTlk
Sent: Wednesday, January 09, 2019 3:01 AM
To: 'Blind Talk Mailing List'
Cc: Justin Williams
Subject: [BlindTlk] further explanations.
I also meant to say that I'm getting the "this is a robot" error, or some
such.