Hi,
I did search but didn't find an answer, so here we go:
Is there a known upper limit to the log file size that can gracefully be handled by LogMX?
I just started trying out the evaluation version of 7.8.0, using Java 8u211 (64-bit) on a MacBook Pro.
The log file I'm interested in analysing is about 350 K lines / 57 MB, but it fails to load.
Reducing the scope to the first 60,000 lines works, but slowly; 100 K lines seems quite a challenge.
I did manage to create a working RE parser, but I'd like to know whether I'm doing something wrong or hitting an inherent tool limitation.
(Yes, that file is large: it's a complex infrastructure setup log in debug mode...)
Example line:
2019-05-23 02:02:09,559 INFO [scheduler-2] CommandLine:365 - Running command: git submodule update --init --force --recursive
Regards
/Anders
Log file size limitations
Re: Log file size limitations
Hello,
I would actually consider a 60 MB log file rather small. There is no file size limitation. But there are two cases:
- You want to load all the log events in memory to see them in the graphical user interface.
- You want to monitor only the end of a log file being updated in real-time (and maybe get alerts if some events happen).
For case #1, LogMX needs enough memory to hold all the parsed log events, so memory usage grows with the file size. For case #2, you don't usually have to worry about memory, unless you want to keep all the log events in memory (i.e. no limit on the AutoRefresh setting "max. number of entries").
But since you mentioned that it works "slowly" with 60,000 lines, I guess memory is not your only concern. And since you also mentioned that you're using an RE Parser (Regular Expression, I guess), I think the regular expression you are using is not efficient. I created a 60 MB log file from the example line you gave, and the RE parser I created loaded the file in less than a second (it also uses only 280 MB of memory, so you shouldn't have memory issues).
Here is the Regexp I used:
Code:
(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+?) (.*?) \[(.*?)\] (.*?):(\d+?) (.*)
One of the common mistakes when building a regexp is using the greedy quantifier .* where the reluctant quantifier .*? could/should be used instead. Notice how I used the greedy one only once: at the end, to capture ALL the remaining characters for the "Message" field. All the other quantifiers are "fixed" like \d{4} or "reluctant" like .*? or \d+?
To learn more about regexps (and more precisely how LogMX uses them), you can have a look at: https://logmx.com/docs/regex-parsers.html
Let me know if you still have trouble loading your log files (usually you shouldn't have any performance/memory issues with files smaller than 100 MB).
Xavier
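
For anyone reading along, here is a quick sketch of how Xavier's pattern splits the example line into fields, and of the greedy-vs-reluctant difference he describes. It uses Python's re module for brevity (LogMX itself runs on Java's java.util.regex, which supports the same fixed, reluctant, and greedy quantifiers); the field labels in the comments are my own, not LogMX settings.

```python
import re

# Xavier's pattern, verbatim: only the final quantifier is greedy.
PATTERN = re.compile(
    r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+?)"  # timestamp (fixed widths)
    r" (.*?)"                                       # level (reluctant)
    r" \[(.*?)\]"                                   # thread (reluctant)
    r" (.*?):(\d+?)"                                # class:line (reluctant)
    r" (.*)"                                        # message (greedy, to end of line)
)

line = ("2019-05-23 02:02:09,559 INFO [scheduler-2] CommandLine:365 "
        "- Running command: git submodule update --init --force --recursive")

m = PATTERN.match(line)
print(m.groups())
# ('2019-05-23 02:02:09,559', 'INFO', 'scheduler-2', 'CommandLine', '365',
#  '- Running command: git submodule update --init --force --recursive')

# Greedy vs. reluctant on input with repeated delimiters: the greedy group
# runs to the LAST ']', the reluctant group stops at the FIRST one.
print(re.match(r"\[(.*)\]", "[a] [b]").group(1))   # a] [b
print(re.match(r"\[(.*?)\]", "[a] [b]").group(1))  # a
```

The design point is the last line of the pattern: the message field is the only place where "match as much as possible" is what you actually want, so it is the only place the greedy .* appears.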