TODO items for Rails-log-analyzer
=================================
Contact willem AT vanbergen DOT org if you want to help out with the development.

General:
- Add more tests / specs

Datamining:
- Add query functionality for the resulting database file (interactive reports?)
- Link request processing line to request completed line (VirtualMongrel?)
- Fix the database inserter and make it more robust for future changes

Rails integration:
- Optionally use local or specific routes.rb file to parse URLs

Other:
- World domination





Datamining should look something like this:

> request-log-analyzer myapp.log --interactive
Request log analyzer builds a new database.
The columns come from the log_parser, since the LOG_LINES definitions list all the keys they can detect.
Some extra derived columns, like hashed_request_url, are added as well.
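
A rough sketch of what that table construction could look like, assuming a SQLite database; the detected
keys and the hashed_request_url derivation below are only illustrative:

  require 'sqlite3'
  require 'digest/sha1'

  # Hypothetical set of keys the log parser detects in completed request lines.
  DETECTED_KEYS = %w[method url status duration db_time rendering_time]

  db = SQLite3::Database.new('requests.db')

  # One TEXT column per detected key, plus extra derived columns.
  columns = DETECTED_KEYS.map { |key| "#{key} TEXT" }
  columns << 'hashed_request_url TEXT' # e.g. SHA1 of the URL with dynamic parts stripped

  db.execute("CREATE TABLE IF NOT EXISTS requests (id INTEGER PRIMARY KEY, #{columns.join(', ')})")

  # Inserting one parsed request (values are illustrative).
  request = { 'method' => 'GET', 'url' => '/posts/1', 'status' => '200',
              'duration' => '0.21', 'db_time' => '0.05', 'rendering_time' => '0.12' }
  values  = DETECTED_KEYS.map { |key| request[key] }
  values << Digest::SHA1.hexdigest(request['url'].gsub(/\d+/, ':id'))

  column_names = (DETECTED_KEYS + ['hashed_request_url']).join(', ')
  placeholders = (['?'] * values.size).join(', ')
  db.execute("INSERT INTO requests (#{column_names}) VALUES (#{placeholders})", values)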

Request log analyzer then parses the log file into individual requests using something like the
virtual mongrel (we need a new name for this: database_summarizer, aggregator, sheepdog?) combined with our
default log parser.
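
One possible shape for that aggregator, a minimal sketch that pairs a Rails "Processing" line with the
following "Completed" line (the line formats are simplified, and a real version would have to cope with
interleaved requests from multiple mongrels):

  # Hypothetical aggregator: collects the "Processing" header of a request and
  # closes the record when the matching "Completed" line arrives.
  class RequestAggregator
    attr_reader :requests

    def initialize
      @current  = nil
      @requests = []
    end

    def feed(line)
      case line
      when /^Processing (\w+)#(\w+) \(for ([\d.]+) at ([\d\- :]+)\)/
        @current = { controller: $1, action: $2, ip: $3, timestamp: $4 }
      when /^Completed in ([\d.]+)/
        if @current
          @current[:duration] = $1.to_f
          @requests << @current
          @current = nil
        end
      end
    end
  end

  # Usage: feed every line of the log file and collect the paired requests.
  aggregator = RequestAggregator.new
  File.foreach('myapp.log') { |line| aggregator.feed(line) }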

When this is done, the user enters an interactive mode (like irb).
> Filters: None
> Total requests in database: 53232
> $

The user can add filters like this:
> $ FILTER SQL ["date > ?", Date.today-1]

The user will then see this:
> Filters:
>  1. ["date > ?", Date.today-1]
> Total requests: 2120
> $
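
A minimal sketch of that interactive loop, assuming the requests table from above; for simplicity the
FILTER SQL argument is treated here as a raw SQL fragment rather than the bind-variable array shown in
the example session:

  require 'sqlite3'

  db      = SQLite3::Database.new('requests.db')
  filters = []

  loop do
    print '$ '
    input = gets or break
    case input.strip
    when /^FILTER SQL (.+)$/i
      filters << $1   # no escaping here; for illustration only
      where = filters.map { |f| "(#{f})" }.join(' AND ')
      count = db.get_first_value("SELECT COUNT(*) FROM requests WHERE #{where}")
      puts 'Filters:'
      filters.each_with_index { |f, i| puts "  #{i + 1}. #{f}" }
      puts "Total requests: #{count}"
    when /^EXIT$/i
      break
    else
      puts 'Unknown command'
    end
  end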

At any point the user can destroy filters, show the raw requests, or show reports:
> $ REPORT ALL

The requests remaining after the filter chain will then be processed by the summarizer and then through the
output templates, generating reports specifically for the selected dataset.
Partial reports should also be possible:
> $ REPORT TIMESPAN
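
A sketch of that last step, with a hypothetical summarizer and a tiny output template for the TIMESPAN
report (the class, method names, and record layout are illustrative, not the actual implementation):

  # Hypothetical summarizer: condenses the filtered requests into a summary hash.
  class Summarizer
    def initialize(requests)
      @requests = requests
    end

    def timespan
      timestamps = @requests.map { |r| r[:timestamp] }
      { first: timestamps.min, last: timestamps.max, count: @requests.size }
    end
  end

  # Tiny "output template" for the TIMESPAN report.
  def report_timespan(summary)
    puts "Requests analyzed: #{summary[:count]}"
    puts "First request:     #{summary[:first]}"
    puts "Last request:      #{summary[:last]}"
  end

  # Usage with a couple of illustrative records:
  requests = [
    { timestamp: '2008-01-01 12:00:00', duration: 0.21 },
    { timestamp: '2008-01-02 09:30:00', duration: 0.10 }
  ]
  report_timespan(Summarizer.new(requests).timespan)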
