Configuration for blocking unwanted spider traffic in a .NET application
<rule name="Block spider">
  <!-- Apply the rule to every URL except robots.txt, so blocked bots can still read the crawl policy -->
  <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
  <conditions>
    <!-- Match the User-Agent header against a list of unwanted crawlers and tools -->
    <add input="{HTTP_USER_AGENT}" pattern="Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu|facebookexternalhit" ignoreCase="true" />
  </conditions>
  <!-- AbortRequest drops the connection without sending any response -->
  <action type="AbortRequest" />
</rule>
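For context, a minimal sketch of where this rule sits in web.config is shown below. The surrounding element names follow the IIS URL Rewrite module's standard layout; the rule body itself is abbreviated and should be the full rule from above.

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Insert the full "Block spider" rule shown above here -->
        <rule name="Block spider">
          <!-- ... -->
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Since curl is in the blocked list, a quick check from the command line (for example, sending a request with a blocked User-Agent such as SemrushBot) should end in a dropped connection rather than a normal response.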
Noting this down; a quick test confirms it works. If dropping the connection is too abrupt, a variant that returns an explicit 403 is sketched after this note.
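As an alternative not taken from the original post, some setups prefer to answer with a 403 status instead of aborting, which makes the block easier to see in server logs and during testing. The URL Rewrite module supports this with a CustomResponse action; a minimal sketch, replacing only the action element while the rest of the rule stays the same:

<!-- Return HTTP 403 instead of silently dropping the connection -->
<action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Request blocked by spider filter" />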
Author: uxinxin
This post is jointly copyrighted by the author and 博客园 (cnblogs). Reposting is welcome, but you must link to the original article and keep this notice; otherwise the author reserves the right to pursue legal liability.