Log analyzers are useful for quickly identifying common issues on a Perforce server.
For more background information, please see:
There are four main options, as described below.
This is related to log2sql.py but a little more recent: it builds a compatible SQLite database.
Please note that it is considerably faster (typically factor of 3-4x) and well worth considering for larger log files.
The resulting databases are compatible with the Flask web application mentioned below.
For more details, see the go-libp4dlog repository on GitHub. Binary releases for Linux/Mac/Windows are available.
The new kid on the block - similar to log_analyzer.php. Its big advantage is a fairly comprehensive test harness, and it handles all manner of recently observed log file scenarios.
In addition, there is an associated Python Flask-based web app which runs various pre-canned queries and charts. It is set up to run inside a Docker container for ease of environment isolation.
As with other tools, the following phases occur:
p4 clone from the Public depot (requires you to create a free account, but has the advantage that you can easily pull in the latest updates):

```shell
p4 -u <your-user-name> clone -p public.perforce.com:1666 -f //guest/perforce_software/log-analyzer/psla/...
```
Alternatively, download the zip file from the project page:
Unzip into a directory, say $HOME/log-analyzer, and cd into that directory.
You can run with Docker if it is installed on your host system, or you can run the app directly using Python (either 2.7 or 3.6) and Flask - in that case you need to install the dependencies:
```shell
cd psla
pip install -r requirements.txt
```
Please note that pip can be installed as a system package (e.g. python-pip) or directly from https://pip.pypa.io/en/stable/installing/
The first invocation uses the default port of 5000.
```shell
$ ./run_psla.sh
 * Serving Flask app "psla"
 * Forcing debug mode on
 * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
 * Restarting with stat
 * Debugger is active!
 * Debugger PIN: 840-914-035
```
In your browser open http://localhost:5000/
From the "Home" page, click the link "Upload a New Log".
Then "Choose file" and finally "Upload":
Once you have uploaded one or more logs, select a database (auto-named according to the name of your log file):
Clicking "analyze" then runs various pre-canned SQL queries:
Note that using other links you can either create charts, or you can run interactive queries (with the pre-canned queries provided as options to be selected and modified).
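The canned queries are plain SQL, so once a database has been generated you can also explore it outside the web app using Python's standard sqlite3 module. The sketch below is self-contained for illustration: the `process` table and its column names are assumptions about the log2sql schema, not guaranteed (check the real schema with `.schema` in the sqlite3 shell), and a real run would open the generated .db file instead of an in-memory database.

```python
import sqlite3

# Build a tiny in-memory stand-in for a log2sql database so this
# sketch is runnable on its own; against a real database you would
# use sqlite3.connect("p4d.db") instead.
# NOTE: table and column names here are assumed, not the guaranteed schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE process (pid INT, user TEXT, cmd TEXT, completedLapse REAL)"
)
conn.executemany(
    "INSERT INTO process VALUES (?, ?, ?, ?)",
    [
        (101, "alice", "user-sync", 12.5),
        (102, "bob", "user-fstat", 0.3),
        (103, "alice", "user-sync", 48.9),
    ],
)

# A typical "canned" query: the slowest commands by elapsed time.
rows = conn.execute(
    """SELECT cmd, user, completedLapse
       FROM process
       ORDER BY completedLapse DESC
       LIMIT 10"""
).fetchall()
for cmd, user, lapse in rows:
    print(f"{cmd:12} {user:8} {lapse:8.1f}s")
```

The interactive query page in the web app runs SQL of exactly this kind, so queries prototyped this way can be pasted in and modified there.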
You can run the script manually:
```shell
export FLASK_APP=`pwd`/psla.py   # Absolute path also possible
export FLASK_DEBUG=1
export PSLA_LOGS=`pwd`/logs      # Absolute path required
flask run --host=0.0.0.0 --port=5000
```
You can change the port if desired.
With Docker, use the Makefile (first build from scratch will take 5-10 mins and requires good internet connection):
To run the container:
The latter runs the web app on http://localhost:5000; a small tweak to the Dockerfile will run it on a different port.
The container can be run manually:
```shell
docker run --rm -p=5000:5000 --name=psla -v `pwd`/psla/logs:/logs psla
```
Note that there is a page to upload a log file into the psla/logs directory inside the container. The container is run with the psla/logs directory (under the current directory) mounted as /logs inside the container, so files persist when the container is not running. Any generated database(s) will also be put in this same directory.
Especially if you have larger log files, it is better to run the analysis manually first and put the results in the logs directory, then run the container to analyse the results.
```shell
cp p4d.log psla/logs
cd psla/logs
../log2sql.py p4d.log
```
View the progress information, and check that the resulting database (*.db) is created. Example:
```shell
$ ../log2sql.py p4d.log
2018-03-28 11:01:47,083:INFO Creating database: p4d.db
2018-03-28 11:01:47,083:INFO Processing p4d.log:
2018-03-28 11:01:47,137:INFO ...0%
2018-03-28 11:01:49,344:INFO ...10%
2018-03-28 11:01:51,096:INFO ...20%
2018-03-28 11:01:52,963:INFO ...30%
2018-03-28 11:01:55,561:INFO ...40%
2018-03-28 11:01:58,133:INFO ...50%
2018-03-28 11:02:00,594:INFO ...60%
2018-03-28 11:02:02,742:INFO ...70%
2018-03-28 11:02:04,194:INFO ...80%
2018-03-28 11:02:06,144:INFO ...90%
2018-03-28 11:02:07,962:INFO ...100%
```
Check the options available:
Then run docker container or web app as described above.
Navigate to http://localhost:5000/analyseLog, select your database name, and click analyse.
You can create jobs or view existing jobs for the project:
This is a PHP script that turns a P4LOG into a SQLite database and generates canned reports from it.
The report is useful for identifying common performance issues on a Perforce Helix server, and further queries can be run against the SQLite database created.
For more information, please refer to the following KB article:
A Python script that scans through a P4LOG file quickly and generates data suitable for producing graphs.
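To illustrate the kind of scan such a script performs, the sketch below buckets P4LOG command lines by minute, producing counts that could feed a graph. It is a simplification, not the actual script: the regular expression covers only the common timestamped info line (e.g. `2018/03/28 11:01:47 pid 1234 user@client ...`), and real server logs contain many other line shapes.

```python
import re
from collections import Counter

# Matches the timestamp prefix of a typical P4LOG info line, e.g.
#   \t2018/03/28 11:01:47 pid 101 alice@ws 10.0.0.1 [p4/...] 'user-sync //...'
# Group 1 captures the date down to the minute for bucketing.
# This is a sketch covering only the common case, not a full parser.
STAMP = re.compile(r"^\t?(\d{4}/\d\d/\d\d \d\d:\d\d):\d\d pid \d+")

def commands_per_minute(lines):
    """Return a Counter of timestamped log lines, bucketed by minute."""
    counts = Counter()
    for line in lines:
        m = STAMP.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical sample log lines for demonstration.
sample = [
    "Perforce server info:",
    "\t2018/03/28 11:01:47 pid 101 alice@ws 10.0.0.1 [p4] 'user-sync //...'",
    "\t2018/03/28 11:01:59 pid 102 bob@ws 10.0.0.2 [p4] 'user-fstat //...'",
    "\t2018/03/28 11:02:07 pid 103 alice@ws 10.0.0.1 [p4] 'user-edit //...'",
]
for minute, n in sorted(commands_per_minute(sample).items()):
    print(minute, n)
```

The per-minute counts can then be plotted directly (e.g. with gnuplot or matplotlib) to show command load over time.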