Large text file editing
#1
Hi,
I desperately need to edit a text log file which is 900MB. Only problem is, it seems to take ages to load and locks the server up. Can anyone suggest a good text editor that reads directly from disc rather than trying to load it all into RAM?
Cheers
FF
#5
Scooby Regular
I used to use Squid at work and if you left the logging on, it would just keep growing and growing. I had to do a report and the log was well over 1GB. I used a utility to split the log file into 50MB chunks. There's loads of free utilities out there that'll do that.
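If you're on a box with the GNU tools, the standard split utility does the same job in one line (file names here are only examples):

    split -b 50m access.log squidchunk_

That writes 50MB pieces named squidchunk_aa, squidchunk_ab and so on; -C 50m instead keeps whole lines together, which is friendlier for a log.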
It's the one thing I hated with Squid, so I moved to ISA. At least it creates a separate file for each day. OK, pain in the @rse to search, but I can always combine those and use grep to search.
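In fact grep will take the daily files directly, so you don't even have to combine them first. Something like this (the file name pattern and search string are made up):

    grep "blockedsite" ISALOG_200109*.log > hits.txt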
Stefan
#6
We use Proxy Server 2, and it does the same as ISA, generating a separate file for each day. With in excess of 3,000 users, and that's only one domain of several we have, the log file hits 800MB a day without fail.
The proxy keeps crashing and MS want me to filter through the file and locate what was going on when it crashed.
Maybe I'll hunt for a splitter, but I'm sure there must be something out there that will do the job.
cheers
#7
Scooby Regular
Surely if you only want to filter the file, then just punt it onto a *nix box and grep out the entries you want, or split the file into n-thousand-line subfiles and use emacs/vi etc.
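For the splitting, split -l does it in one go, e.g. 100,000-line subfiles (the size is just a guess at what an editor will cope with):

    split -l 100000 proxy.log part_

Then each part_aa, part_ab etc. opens comfortably in emacs or vi.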
If it's just the last few '000 lines you are interested in (before it crashed), then tail -n 1000 file > file.out is trivial.
There may be someone about who can chop it up with Perl if that would be easier...
Steve
#8
Scooby Regular
The fact that it's an 800MB log file is probably what's crashing it.
If you know what you're looking for, I'd use grep to search through the file. You don't need Unix, as there are plenty of grep utilities ported to Win32.
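For instance, if you know roughly when it crashed, something like this pulls out just that window (the timestamp pattern depends entirely on your log format):

    grep "14:2" proxy.log > crash-window.txt

That grabs anything logged between 14:20 and 14:29 and leaves you a file small enough to open anywhere.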
Stefan
#11
You need a Unix box.
Anyway, if you know what you're looking for, follow Steve's advice. There is a port of grep for Windows: http://www.wingrep.com
Also, again as Steve says, a Perl script is the best idea for splitting the file. I'm assuming that if you have Perl you can build your own; however, I can throw one together.
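Something along these lines should do it; a rough sketch, untested on anything 900MB, and the chunk size is only a placeholder:

    #!/usr/bin/perl
    # splitlog.pl - split a big log into fixed-line-count chunks, streaming line by line
    # usage: perl splitlog.pl biglog.txt
    use strict;
    use warnings;

    my $lines_per_chunk = 100_000;                 # placeholder; tune to taste
    my $file = shift or die "usage: $0 logfile\n";

    open my $in, '<', $file or die "can't open $file: $!\n";

    my ($count, $chunk, $out) = (0, 0, undef);
    while (my $line = <$in>) {
        if ($count % $lines_per_chunk == 0) {      # time to start a new chunk
            close $out if $out;
            $chunk++;
            open $out, '>', sprintf("%s.%03d", $file, $chunk)
                or die "can't write chunk $chunk: $!\n";
        }
        print {$out} $line;
        $count++;
    }
    close $out if $out;
    print "wrote $chunk chunk(s)\n";

Because it reads a line at a time it never holds more than one line in memory, so it shouldn't lock the box up the way an editor does.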
As others have said, will emacs or vim not open something of this size? I have never tried it, as my log files are backed up daily and never get that large.
Can you post back with your outcome? I'm actually quite interested in this (yep, I need help), and I don't want to have to go to the trouble of creating a 900MB file of random chars.