Find largest files
How to find the largest files in a directory tree on Solaris.
cd to the starting point and, as the superuser, run: ls -ailR | sort -rn +5 | more (the old-style +5 option tells sort to sort numerically on the sixth field, which is the file size).
Sample output under SuSE Linux:
deathstar:/opt # ls -ailR | sort -rn +5 | more
247704 -rwxr-xr-x 1 root root 18330960 Nov 20  2004 oxpsp1.exe
243985 -rwxr-xr-x 1 root root 15516048 Nov 20  2004 oxpsp2.exe
 91951 -r--r--r-- 1 root root 14886200 Apr 13  2005 libsvx645li.so
 91952 -r--r--r-- 1 root root 13648358 Apr 13  2005 libsw645li.so
 96205 -rwxr-xr-x 1 root root 12957520 Sep 13  2004 thunderbird-bin
 91931 -r--r--r-- 1 root root 10539873 Apr 13  2005 libsc645li.so
974337 -rwxr-xr-x 1 root root  9771152 Nov  7  2004 firefox-bin
   447 -rwxr-xr-x 1 root root  9004320 Sep 26 14:02 libgklayout.so
 91890 -r--r--r-- 1 root root  8576819 Apr 13  2005 libicudata.so.22.0
201928 -rwxr-xr-x 1 root root  6649080 Apr  7  2004 libgucharmap.so.3.0.3
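A side note: the `+5` field syntax is long obsolete, and GNU sort (as on current SuSE) and strictly POSIX sorts reject it. A rough equivalent using the newer -k syntax, assuming the same `ls -ail` field layout where the size is the sixth field:

```shell
# Same idea with modern sort syntax: -k6,6 restricts the sort key to the
# sixth field of "ls -ail" output (the file size in bytes); -rn sorts it
# numerically, largest first. head keeps the display manageable.
ls -ailR /opt | sort -rn -k6,6 | head -20
```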
Here’s a Korn shell script that finds the largest files on Solaris:
#!/bin/ksh
# igor@krazyworks.com
#
# Use this script to find the largest files in a particular directory.
#
# Variables:
# $pathname    name of the directory in which to search for large files

clear
echo "Enter path which to search: \c"
read pathname

if [ -z "$pathname" ]
then
    echo "Error: pathname cannot be null! Exiting..."
    exit 1
fi

if [ -d "$pathname" ]
then
    clear
    echo " "
    echo "File Owner   File Size   File Date      File Name"
    echo "________________________________________________"
    echo " "
    du -koda "$pathname" | sort -rn | head -50 | awk '{print $2}' | while read file
    do
        if [ -f "$file" ]
        then
            # Pull the owner, size, date, and path out of the ls listing
            owner=$(ls -als "$file" | awk '{print $4}')
            bsize=$(ls -als "$file" | awk '{print $6}')
            (( fsize = bsize / 1024 / 1024 ))
            mdate=$(ls -als "$file" | awk '{print $7" "$8" "$9}')
            fpath=$(ls -als "$file" | awk '{print $10" "$11" "$12}')
            echo $owner"   "$fsize" Mb   "$mdate"   "$fpath
        fi
    done
else
    echo " "
    echo "Error: '$pathname' - no such directory! Exiting..."
    echo " "
    exit 1
fi
Sample output of the above script on a Solaris 8 box:
File Owner   File Size   File Date      File Name
_____________________________________________
nbar   93 Mb   Nov 21 08:14   /opt/SUNWnbaro/var/nbar/nbjobs.MYI
nbar   60 Mb   Nov 21 08:06   /opt/SUNWnbaro/var/nbar/nbjobfs.MYD
nbar   59 Mb   Nov 21 05:20   /opt/SUNWnbaro/var/nbar/nbwindows.MYI
nbar   50 Mb   Nov 21 08:14   /opt/SUNWnbaro/var/nbar/nbjobs.MYD
nbar   43 Mb   Nov 20 08:16   /opt/SUNWnbaro/var/nbar/nbwindows.MYD
nbar   31 Mb   Nov 21 08:06   /opt/SUNWnbaro/var/nbar/nbjobfs.MYI
nbar   13 Mb   Nov 21 05:20   /opt/SUNWnbaro/var/nbar/nbschedules.MYI
root   13 Mb   May  6  2001   /opt/VRTSob/jre/lib/rt.jar
root   12 Mb   May 11  2001   /opt/VRTSvcs/gui/jre/lib/rt.jar
root   12 Mb   Aug  7  2003   /opt/QLogic_Corporation/SANblade_Control_FX/scfx
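The heart of the script is a du/sort pipeline; everything else is formatting. If you only need the file list and not the pretty columns, a minimal portable sketch of the same idea (the /opt path here is just an example) is:

```shell
# du -a lists every file (not just directories), -k reports sizes in
# kilobytes so the numeric sort is consistent across filesystems.
# sort -rn orders by size descending; head keeps the 50 largest entries.
du -ak /opt 2>/dev/null | sort -rn | head -50
```

This works with plain POSIX du, so it is usable on Solaris and Linux alike; the Solaris-specific -o and -d flags in the script above merely keep du from crossing into other directories and filesystems.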