# Perforce Server Log Analyzers

Log analyzers are useful to identify common issues on a Perforce server quickly.

For more background information, please see:

* https://community.perforce.com/s/article/5470
* https://community.perforce.com/s/article/2514

There are four main options, described below.
# log2sql - Go executable

This is related to `log2sql.py` but a little more recent - it builds a compatible SQLite database.
Please note that it is considerably faster (typically a factor of 3-4x) and well worth considering for larger log files.
The databases it builds are compatible with the Flask web application mentioned below.

For more details, see the [go-libp4dlog](https://github.com/rcowham/go-libp4dlog) repository on GitHub. Binary releases for Linux/Mac/Windows are available.
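As an illustration only, a typical invocation might look like the sketch below; the exact binary name and arguments are assumptions here, so check the go-libp4dlog releases page for the actual usage.

```bash
# Hypothetical invocation - binary name and arguments are assumptions;
# consult the go-libp4dlog releases/README for the real usage.
./log2sql p4d.log
# The resulting SQLite database can then be used with the Flask web app described below.
```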
# log2sql.py

The new kid on the block - similar to log_analyzer.php. Its big advantage is a fairly comprehensive test harness, and it processes all manner of recently observed log file scenarios.

In addition, there is an associated Python Flask-based web app which runs various pre-canned queries and charts. It is set up to run inside a Docker container for ease of isolation of environment.

## Analysis Phases

As with other tools, the following phases occur:

* Upload log file
* Parse it using the tool into a SQLite database
* Run SQL queries against that database
## Clone project (using p4 dvcs)

Or `p4 clone` from the Public depot (this requires you to create a free account, but has the advantage that you can easily pull in the latest updates using `p4 fetch`):

```
p4 -u <your-user-name> clone -p public.perforce.com:1666 -f //guest/perforce_software/log-analyzer/psla/...
```
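Once cloned, pulling in later updates is straightforward; a minimal sketch, assuming you run it from within the cloned directory:

```bash
# From within the directory you cloned into, fetch the latest changes
# from the public depot remote:
p4 fetch
```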
## Download Project

Alternatively, download the zip file from the project page:

https://swarm.workshop.perforce.com/projects/perforce-software-log-analyzer/files/psla

Unzip into a directory, say $HOME/log-analyzer, and cd into that directory.
## Running the web app

You can run with Docker if it is installed on your host system. Alternatively you can run the app directly using Python (either 2.7 or 3.6) and Flask - in this case you need to install the dependencies:

```
cd psla
pip install -r requirements.txt
```

Please note that pip can be installed as a system package (e.g. python-pip), or directly from https://pip.pypa.io/en/stable/installing/
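For example, on a Debian/Ubuntu system the package route looks like this (package names vary by distribution and Python version, so treat these as illustrative):

```bash
# Debian/Ubuntu package names - adjust for your platform/Python version
sudo apt-get install python-pip     # pip for Python 2.7
sudo apt-get install python3-pip    # pip for Python 3
```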
To run:

```
./run_psla.sh
```

or

```
./run_psla.sh 5005
```

The first invocation uses the default port of 5000:

```bash
$ ./run_psla.sh
* Serving Flask app "psla"
* Forcing debug mode on
* Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
* Debugger PIN: 840-914-035
```
In your browser open http://localhost:5000/

## Uploading and parsing your first log

From the "Home" page, click the link "Upload a New Log":

![First page](psla/images/home_page.png)

Then "Choose file" and finally "Upload":

![Upload](psla/images/upload.png)

Once you have uploaded one or more logs, select a database (automatically named according to the name of your log file):

![Upload](psla/images/select_database.png)

Clicking "analyze" then runs various pre-canned SQL queries:

![Upload](psla/images/analyze.png)

Note that using the other links you can either create charts, or run interactive queries (with the pre-canned queries provided as options to be selected and modified).
## Running manually

You can run the script manually:

```bash
export FLASK_APP=`pwd`/psla.py # Absolute path also possible
export FLASK_DEBUG=1
export PSLA_LOGS=`pwd`/logs # Absolute path required
flask run --host=0.0.0.0 --port=5000
```

Obviously you can change the port if desired.
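For example, to serve the app on port 5005 instead:

```bash
flask run --host=0.0.0.0 --port=5005
```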
## Running in Docker

With Docker, use the Makefile (the first build from scratch will take 5-10 minutes and requires a good internet connection):

```
make build
```

To run the container:

```
make up
```

The latter runs the web app on http://localhost:5000; a simple tweak to the Dockerfile will run it on a different port.

The container can also be run manually:

```
docker run --rm -p=5000:5000 --name=psla -v `pwd`/psla/logs:/logs psla
```
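If you only need the app exposed on a different host port, changing the left-hand side of the -p mapping is enough and avoids editing the Dockerfile, for example:

```bash
# Map host port 5001 to the container's port 5000
docker run --rm -p=5001:5000 --name=psla -v `pwd`/psla/logs:/logs psla
```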
Note that there is a page in the web app to upload a log file into the psla/logs directory inside the container. The container is run with the psla/logs directory under the current directory mounted as /logs inside the container, so uploaded files persist when the container is not running. Any generated database(s) will also be put in this same directory.
## Manually analysing the logs

Especially if you have larger log files, it is better to run the analysis manually first and put the results in the logs directory, then run the container to analyse the results. For example:

```
cp p4d.log psla/logs
cd psla/logs
../log2sql.py p4d.log
```
View the progress information, and check that the resulting database (*.db) is created. Example:

```
$ ../log2sql.py p4d.log
2018-03-28 11:01:47,083:INFO Creating database: p4d.db
2018-03-28 11:01:47,083:INFO Processing p4d.log:
2018-03-28 11:01:47,137:INFO ...0%
2018-03-28 11:01:49,344:INFO ...10%
2018-03-28 11:01:51,096:INFO ...20%
2018-03-28 11:01:52,963:INFO ...30%
2018-03-28 11:01:55,561:INFO ...40%
2018-03-28 11:01:58,133:INFO ...50%
2018-03-28 11:02:00,594:INFO ...60%
2018-03-28 11:02:02,742:INFO ...70%
2018-03-28 11:02:04,194:INFO ...80%
2018-03-28 11:02:06,144:INFO ...90%
2018-03-28 11:02:07,962:INFO ...100%
```
Check the options available:

```
log2sql.py -h
```
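You can also query the resulting database directly with the sqlite3 command line tool. The table and column names in the example below are assumptions for illustration only; check the real schema first with .schema:

```bash
# Inspect the schema generated by log2sql.py
sqlite3 p4d.db ".schema"

# Example query - table/column names are assumptions, adjust to the actual schema
sqlite3 p4d.db "SELECT cmd, COUNT(*) AS num_cmds, SUM(completedLapse) AS total_lapse
                FROM process
                GROUP BY cmd
                ORDER BY total_lapse DESC
                LIMIT 10;"
```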
Then run the Docker container or web app as described above. Navigate to http://localhost:5000/analyseLog, select your database name and click "analyse".
## Reporting problems or requesting enhancements

You can create jobs or view existing jobs for the project:

https://swarm.workshop.perforce.com/projects/perforce-software-log-analyzer/jobs/
# log_analyzer.php

This is a PHP script that turns a `P4LOG` into a SQLite database and generates canned reports from it.

The report is useful for identifying common performance issues on a Perforce Helix server, and further queries can be run against the SQLite database created.

For more information, please refer to the following KB article:

https://community.perforce.com/s/article/1266
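A hypothetical invocation is sketched below; the actual command line arguments are an assumption here, so refer to the KB article above for the real usage.

```bash
# Hypothetical invocation - argument handling is an assumption,
# see the KB article for the actual options.
php log_analyzer.php p4d.log
```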
# p4clog.py

A Python script that scans through a `P4LOG` file quickly and generates data suitable for creating graphs.

https://swarm.workshop.perforce.com/downloads/guest/perforce_software/log-analyzer/psla/p4clog.py
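The script can be fetched directly with curl; the invocation shown afterwards is only a hypothetical sketch, as the actual options are not documented here.

```bash
# Download the script
curl -O https://swarm.workshop.perforce.com/downloads/guest/perforce_software/log-analyzer/psla/p4clog.py

# Hypothetical invocation - check the script's usage/help output for actual options
python p4clog.py p4d.log
```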