
Large text file editing

Old 04 December 2003, 09:37 PM
  #1  
fastfrank
Scooby Regular
Thread Starter
 
Join Date: Apr 2003
Posts: 867

Hi,
I desperately need to edit a text log file which is 900MB. The only problem is that it takes ages to load and locks the server up. Can anyone suggest a good text editor that reads directly from disc rather than trying to load the whole file into RAM?


Cheers
FF
Old 04 December 2003, 09:39 PM
  #2  
zhastaph
Scooby Regular
 
Join Date: Sep 2003
Location: Isle of Wight
Posts: 2,720

900meg ??!!!??!?
Old 04 December 2003, 09:42 PM
  #3  
fastfrank
Scooby Regular
Thread Starter
 
Join Date: Apr 2003
Posts: 867

Yes indeedy, it's a proxy server log. I need to send a report to Microsoft, but I can't get into it cos it's so damn big.
Old 04 December 2003, 09:43 PM
  #4  
michael_clarkson
Scooby Regular
 
Join Date: Jan 2001
Posts: 253

Can't you use split to chop the file into manageable chunks? Or, if you know what you want to change, use sed or something similar.
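
Something like this should do the trick (just a rough sketch - "proxy.log" and the sed patterns are made up, swap in your own):

# chop the log into 500,000-line pieces (chunk_aa, chunk_ab, ...)
split -l 500000 proxy.log chunk_

# or, if you already know the change, stream it through sed without loading the file into RAM
sed 's/oldtext/newtext/g' proxy.log > proxy.fixed.log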
Old 04 December 2003, 09:50 PM
  #5  
ozzy
Scooby Regular
 
Join Date: Nov 1999
Location: Scotland, UK
Posts: 10,504

I used to use Squid at work and if you left the logging on, it would just keep growing and growing. I had to do a report and the log was well over 1GB. I used a utility to split the log file into 50MB chunks - there are loads of free utilities out there that'll do that.

It's the one thing I hated about Squid, so I moved to ISA. At least it creates a separate file for each day. OK, it's a pain in the @rse to search, but I can always combine those files and grep them.
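
Something along these lines does it on a *nix box (or with the Win32 ports) - the filenames and search string are just placeholders:

# chop into 50MB pieces; note -b can split in the middle of a line, -l splits on whole lines
split -b 50m proxy.log chunk_

# or combine the daily files and grep them in one pass
cat *.log | grep -i "error" > hits.txt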

Stefan
Old 04 December 2003, 09:56 PM
  #6  
fastfrank
Scooby Regular
Thread Starter
 
Join Date: Apr 2003
Posts: 867

We use Proxy Server 2, and it does the same as ISA, generating a separate file for each day. With in excess of 3,000 users - and that's only one of the several domains we have - the log file hits 800MB a day without fail.

The proxy keeps crashing and MS want me to filter through the file and locate what was going on when it crashed.

Maybe I'll hunt for a splitter, but I'm sure there must be something out there that will do the job.

cheers
Old 05 December 2003, 12:05 AM
  #7  
stevem2k
Scooby Regular
 
Join Date: Sep 2001
Location: Kingston ( Surrey, not Jamaica )
Posts: 4,670

Surely if you only want to filter the file, just punt it onto a *nix box and grep out the entries you want, or split it into a few-thousand-line subfiles and use emacs/vi, etc.

If it's just the last few thousand lines you're interested in (before it crashed), then tail -n 1000 file > file.out is trivial.
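
e.g. something like this (the filename and timestamp are placeholders - match them to whatever your log actually uses):

# grab the last 100,000 lines, roughly the run-up to the crash
tail -n 100000 proxy.log > tail.log

# then pull out just the minutes around the crash time
grep "2003-12-04 21:3" tail.log > crash_window.log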

There may be someone about who can chop it up with Perl if that would be easier...


Steve
Old 05 December 2003, 10:38 AM
  #8  
ozzy
Scooby Regular
 
Join Date: Nov 1999
Location: Scotland, UK
Posts: 10,504

The fact that it's an 800MB log file is probably what's crashing it.

If you know what you're looking for, I'd use grep to search through the file. You don't need Unix, as there are plenty of grep utilities ported to Win32.
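
For example (GNU grep options, which the Win32 ports share - the pattern and filename are just placeholders):

# print each match plus the 20 lines leading up to it, with line numbers
grep -n -B 20 "500" proxy.log > context.txt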

Stefan
Old 05 December 2003, 06:56 PM
  #9  
fastfrank
Scooby Regular
Thread Starter
 
Join Date: Apr 2003
Posts: 867

Cheers for the tips, guys.

PS. Nice website, Ozzy :-)

FF
Old 05 December 2003, 07:27 PM
  #10  
judgejules
Scooby Regular
 
Join Date: Nov 2000
Posts: 1,227

PFE or VIM

I believe VIM is better, as it will seek to the place you scroll to rather than loading the whole file in one go.
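
If you do try vim on something that big, turning the extras off helps a lot (these are standard vim switches; the filename is just a placeholder):

# skip vimrc/plugins/syntax highlighting and don't create a swap file
vim -u NONE -n proxy.log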

HTH

~Jules
Old 05 December 2003, 08:53 PM
  #11  
Gedi
Scooby Regular
 
Join Date: Jan 2003
Posts: 932

You need a Unix box.

Anyway, if you know what you're looking for, follow Steve's advice. There's a port of grep for Windows: http://www.wingrep.com

Also, as Steve says, a Perl script is the best idea for splitting the file too. I'm assuming that if you have Perl you can build your own, but I can throw one together if not.


As others have said, will emacs or vim not open something of this size? I've never tried it, as my log files are backed up daily and never get that large.
Can you get back with your outcome? I'm actually quite interested in this (yep, I need help), and I don't want to have to go to the trouble of creating a 900MB file of random chars.
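
(If anyone does want to knock up a test file, something like this fakes it - not truly random, and the sample line is made up, but it's quick:)

# repeat a dummy log line until we hit roughly 900MB
yes "10.0.0.1 anonymous GET http://www.example.com/page.htm 200" | head -c 900000000 > testlog.txt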
