Hi all. We're looking to improve server performance by filtering out bots that hit the site, and we already maintain a list of bots. Is there a best practice for doing this effectively? We obviously still want some spiders (e.g. Googlebot) to be able to index the content. For example, for documents linked from our web pages, would it make sense to block bots from following those links? Same question for form submissions.
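To make the question concrete, here's a minimal sketch of the kind of User-Agent check we have in mind, assuming a simple substring match against our list (the bot and crawler names below are placeholders, not our actual list):

```python
# Placeholder blocklist standing in for the list we already maintain.
BLOCKED_BOTS = {"AhrefsBot", "SemrushBot", "MJ12bot"}
# Crawlers we still want indexing the site (examples, not exhaustive).
ALLOWED_CRAWLERS = {"Googlebot", "Bingbot"}

def should_block(user_agent: str) -> bool:
    """Return True if the User-Agent matches our bot list and is not
    one of the crawlers we want to keep indexing the site."""
    ua = user_agent or ""
    if any(good in ua for good in ALLOWED_CRAWLERS):
        return False  # always let allowlisted spiders through
    return any(bad in ua for bad in BLOCKED_BOTS)

# Quick check:
print(should_block("Mozilla/5.0 (compatible; AhrefsBot/7.0)"))   # True
print(should_block("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # False
```

The idea would be to run something like this as early as possible in the request pipeline, before any expensive handlers fire, but we're open to entirely different approaches. Any feedback is greatly appreciated.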