UNIX Socket FAQ

A forum for questions and answers about network programming on Linux and all other Unix-like systems


#1 2009-01-09 12:20 PM

Jana
Member
Registered: 2008-03-24
Posts: 21

Re: splitting the large file

Hi,
I have a very large file (25 GB in size), and I am not able to open it. I tried to split the file into two pieces, but I do not have enough space on my system for the new files.

Actually, I am only interested in the last 1000 lines of this file. How can I get just the last 1000 lines into a new file? Please let me know how to achieve this.

I searched the web but did not find an example for this case.

Thanks,
Jana


#2 2009-01-09 01:46 PM

RobSeace
Administrator
From: Boston, MA
Registered: 2002-06-12
Posts: 3,826

Re: splitting the large file

What do you mean, you're not able to open it?  Does open()/fopen() fail?  Are you
compiling with 64-bit file support?

Or, are you attempting to open it with some other specific app?  If so, which one,
exactly?  If it's an editor of some sort, chances are that it's attempting to load the
entire contents of the file into RAM, which is likely what's failing...

All you should need is something compiled with 64-bit file support (if on a 64-bit
platform, you're all set; but on a 32-bit platform you need to compile with large-file
support, e.g. -D_FILE_OFFSET_BITS=64 with glibc), which opens the file and just reads
out the portion you want...

> Actually I am interested only in the last 1000 lines in this file. How can get only last 1000 lines of this file to a new file ???

Have you tried "tail -n 1000 oldfile > newfile"?


#3 2009-01-10 01:45 AM

i3839
Oddministrator
From: Amsterdam
Registered: 2003-06-07
Posts: 2,230

Re: splitting the large file

See also the comment about O_LARGEFILE in the open(2) manpage; alternatively,
if you use stdio, you can seek to the end of the file with fseeko().

