README.md in browser-2.7.1 vs README.md in browser-3.0.0
- old
+ new
@@ -56,11 +56,12 @@
# Get bot info
browser.bot.name
browser.bot.search_engine?
browser.bot?
-Browser::Bot.why?(ua) # shows which user agent was the offender
+browser.bot.why? # shows which matcher detected this user agent as a bot.
+Browser::Bot.why?(ua)
# Get device info
browser.device
browser.device.id
browser.device.name
@@ -144,25 +145,25 @@
- For a list of device detections, check [lib/browser/device.rb](https://github.com/fnando/browser/blob/master/lib/browser/device.rb)
- For a list of bot detections, check [bots.yml](https://github.com/fnando/browser/blob/master/bots.yml)
### What defines a modern browser?
-The current rules that define a modern browser are pretty loose:
+The current rules that define a modern browser are pretty loose; any of the following qualifies:
-* Webkit
-* IE9+
-* Microsoft Edge
-* Firefox 17+
-* Firefox Tablet 14+
-* Opera 12+
+* Chrome 65+
+* Safari 10+
+* Firefox 52+
+* IE11+
+* Microsoft Edge 39+
+* Opera 50+
You can define your own rules. A rule must be a proc/lambda or any object that implements the method `===` and accepts the browser object. To redefine all rules, clear the existing rules before adding your own.
```ruby
-# Only Chrome Canary is considered modern.
+# Only Google Chrome 79+ is considered modern.
Browser.modern_rules.clear
-Browser.modern_rules << -> b { b.chrome? && b.version.to_i >= 37 }
+Browser.modern_rules << -> b { b.chrome? && b.version.to_i >= 79 }
```
### Rails integration
Just add it to the Gemfile.
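For reference, this is all the Gemfile needs (the entry below is unversioned; pin a version if your app requires one):

```ruby
# Gemfile
gem "browser"
```

Then run `bundle install`.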
@@ -211,11 +212,11 @@
language.name
#=> "English/United States"
```
-Result is always sorted in quality order from highest -> lowest. As per the HTTP spec:
+The result is always sorted in quality order, from highest to lowest. As per the HTTP spec:
- omitting the quality value implies 1.0.
- a quality value equal to zero means the language is not accepted by the client.
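A small sketch of that ordering, assuming the `Browser.new(ua, accept_language: ...)` constructor and `code`/`quality` readers on each entry (the printed result is illustrative, not verified output):

```ruby
# Sketch only: `code` and `quality` are assumed accessors on the
# Accept-Language entries; the output shown is illustrative.
browser = Browser.new(ua, accept_language: "en-US;q=0.8,pt-BR")

browser.accept_language.map { |language| [language.code, language.quality] }
#=> [["pt", 1.0], ["en", 0.8]]
```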
### Internet Explorer
@@ -264,13 +265,32 @@
#=> true
```
### Bots
-Browser used to detect empty user agents as bots, but this behavior has changed. If you want to bring this detection back, you can activate it through the following call:
+The bot detection is quite aggressive. Anything that matches at least one of the following requirements will be considered a bot (see the sketch after the list):
+- Empty user agent string
+- User agent that matches `/crawl|fetch|search|monitoring|spider|bot/`
+- Any known bot listed under [bots.yml](https://github.com/fnando/browser/blob/master/bots.yml)
+
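As a rough sketch of how those defaults behave (the user agent strings below are made up):

```ruby
# Made-up user agents; both should be flagged by the default matchers.
Browser.new("").bot?                            # empty user agent string
#=> true

Browser.new("acme-uptime-monitoring/1.0").bot?  # matches the keyword regexp
#=> true
```
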
+To add custom matchers, you can add a callable object to `Browser::Bot.matchers`. The following example matches anything that contains the substring `externalhit`. The bot name will always be `General Bot`.
+
```ruby
-Browser::Bot.detect_empty_ua!
+Browser::Bot.matchers << ->(ua, _browser) { ua =~ /externalhit/i }
+```
+
+To clear all matchers, including the ones that are bundled, use `Browser::Bot.matchers.clear`. You can re-add built-in matchers by doing the following:
+
+```ruby
+Browser::Bot.matchers += Browser::Bot.default_matchers
+```
+
+To restore v2's bot detection, remove the following matchers:
+
+```ruby
+Browser::Bot.matchers.delete(Browser::Bot::KeywordMatcher)
+Browser::Bot.matchers.delete(Browser::Bot::EmptyUserAgentMatcher)
```
### Middleware
You can use `Browser::Middleware` to redirect user agents.
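A rough sketch of the Rails wiring, assuming the middleware's block API; `upgrade_path` is a hypothetical route pointing at your own "unsupported browser" page:

```ruby
# Sketch: upgrade_path is a hypothetical route for an "unsupported browser" page.
Rails.configuration.middleware.use Browser::Middleware do
  redirect_to upgrade_path unless browser.modern?
end
```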