Command / Code
find / -type f -size +100M -ls 2>/dev/null
Description
How to Find Large Files on Linux?
Quick Answer: Use find / -type f -size +100M -ls 2>/dev/null to locate all files larger than 100MB on your Linux system, or du -sh /* | sort -hr to see directory sizes sorted from largest to smallest.
Basic Command Examples
# Find files larger than 100MB system-wide
find / -type f -size +100M -ls 2>/dev/null
# Find large files in specific directory
find /home -type f -size +500M -exec ls -lh {} \;
# Show directory sizes sorted by largest
du -sh /* 2>/dev/null | sort -hr
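When scanning from /, virtual filesystems such as /proc and /sys add noise and slow the search down. One way to skip them is GNU find's -prune; this is only a sketch, and the pruned paths are typical defaults you may want to adjust:
# Skip virtual filesystems when scanning from the root
find / \( -path /proc -o -path /sys -o -path /run \) -prune -o -type f -size +100M -print 2>/dev/null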
Step-by-Step Guide to Find Large Files
- Open your terminal and decide on the size threshold (e.g., 100MB, 1GB)
- Choose your search location – use / for the entire system or a specific path like /home
- Run the find command with your size parameter: find /path -type f -size +100M
- Add formatting options like -ls for detailed output or -exec ls -lh {} \; for human-readable sizes
- Redirect errors by adding 2>/dev/null to hide permission denied messages
- Sort results by size using | sort -k5 -hr (size is column 5 of ls -l output; with -ls output use -k7 instead)
- Save results to a file with > large_files.txt for later analysis
- Review and clean up identified large files that are no longer needed (a combined example follows this list)
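Putting these steps together, a minimal end-to-end sketch (the /home path, the 100M threshold, and the large_files.txt filename are example values) might look like this:
# Search, format, suppress errors, sort by size, and save the results
find /home -type f -size +100M -exec ls -lh {} \; 2>/dev/null | sort -k5 -hr > large_files.txt
# Inspect the twenty largest entries
head -20 large_files.txt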
Command Reference Table
Command | Purpose | Example | Notes |
---|---|---|---|
find /path -size +100M | Find files larger than 100MB | find /home -size +1G -type f | Use +/-/exact for size comparison |
du -sh /path/* | Show directory sizes | du -sh /var/* | sort -hr | Human-readable format with sorting |
find /path -size +100M -delete | Find and delete large files | find /tmp -size +500M -delete | Caution: Permanently deletes files |
ncdu /path | Interactive disk usage analyzer | ncdu /home/user | Requires ncdu installation |
find /path -size +100M -mtime +30 | Find large old files | find /var/log -size +50M -mtime +7 | Combines size and age criteria |
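Since ncdu is usually not installed by default, installing it on common distributions (assuming an apt or dnf based system; the package name is ncdu on both) looks like:
# Debian/Ubuntu
sudo apt install ncdu
# Fedora/RHEL
sudo dnf install ncdu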
Real-World Use Cases
Server Cleanup Scenario
# Find log files larger than 100MB older than 7 days
find /var/log -name "*.log" -size +100M -mtime +7 -ls
# Find backup files consuming space
find /backup -name "*.tar.gz" -size +1G -exec ls -lh {} \;
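If the old logs should be kept rather than removed, one option is to compress them in place. This is only a sketch: logs that are still being written to by a service are better handled through logrotate.
# Compress oversized logs older than a week instead of deleting them
find /var/log -name "*.log" -size +100M -mtime +7 -exec gzip {} \;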
Home Directory Management
# Find large media files in home directory
find ~/ -type f \( -name "*.mp4" -o -name "*.mkv" -o -name "*.iso" \) -size +500M
# Identify large downloads
find ~/Downloads -type f -size +100M -printf "%s %p\n" | sort -nr
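The -printf variant above prints raw byte counts; on systems with GNU coreutils, numfmt can convert the first column to human-readable sizes:
# Same listing with human-readable sizes in the first column
find ~/Downloads -type f -size +100M -printf "%s %p\n" | sort -nr | numfmt --to=iec --field=1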
Development Environment
# Measure the size of node_modules directories across projects
find ~/projects -name "node_modules" -type d -exec du -sh {} \;
# Locate large Docker images or containers data
find /var/lib/docker -type f -size +500M 2>/dev/null
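If the Docker CLI is installed, its built-in disk usage summary is usually more direct than walking /var/lib/docker with find:
# Summary of disk used by images, containers, and volumes
docker system df
# Per-object breakdown
docker system df -v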
Advanced Filtering Options
Size Specifications
- +100M – Larger than 100 megabytes
- -100M – Smaller than 100 megabytes
- 100M – Exactly 100 megabytes
- +1G – Larger than 1 gigabyte
- +1024k – Larger than 1024 kilobytes
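GNU find accepts several size suffixes (c = bytes, k = kibibytes, M = mebibytes, G = gibibytes; a bare number counts 512-byte blocks), so the same threshold can be written in different units. The commands below are roughly equivalent:
# Roughly equivalent 100 MB thresholds in different units
find /var -type f -size +100M
find /var -type f -size +102400k
find /var -type f -size +104857600c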
Time-Based Filtering
# Files modified in last 7 days
find / -type f -size +100M -mtime -7
# Files not accessed in 30 days
find / -type f -size +100M -atime +30
# Files whose status changed in the last 24 hours (ctime is change time, not creation time)
find / -type f -size +100M -ctime -1
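GNU and BSD versions of find also accept absolute timestamps via -newermt, which can be easier than counting days; the dates below are placeholders:
# Large files modified between two dates
find / -type f -size +100M -newermt "2024-01-01" ! -newermt "2024-02-01" 2>/dev/null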
Alternative Tools and Methods
Using du Command
# Top 10 largest directories
du -sh /* 2>/dev/null | sort -hr | head -10
# Largest files in current directory tree
du -a . | sort -nr | head -20
Using ls with Sorting
# Largest files in directory (recursive)
find /path -type f -exec ls -la {} \; | sort -k5 -nr | head -10
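Spawning ls once per file is slow on large trees. With GNU find, -printf produces the same ranking in a single pass; a sketch:
# Ten largest files without invoking ls for each one (sizes in bytes)
find /path -type f -printf "%s %p\n" 2>/dev/null | sort -nr | head -10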
Troubleshooting Common Issues
Permission Denied Errors
Problem: Getting “Permission denied” messages
Solution: Add 2>/dev/null to suppress errors, or run with sudo
sudo find / -type f -size +100M -ls 2>/dev/null
Command Taking Too Long
Problem: Search is very slow on large filesystems
Solution: Limit search scope or use background processing
# Search specific directories only
find /home /var -type f -size +100M -ls 2>/dev/null
# Run in background
nohup find / -type f -size +100M -ls 2>/dev/null > large_files.txt &
Out of Memory Errors
Problem: Post-processing a very large result set (for example sorting millions of paths at once) exhausts memory
Solution: Stream the results and process them incrementally in small batches
# Process results in smaller chunks
find / -type f -size +100M -print0 | xargs -0 -n 10 ls -lh
Performance Optimization Tips
- Limit search scope – Don't search the entire filesystem unless necessary
- Use specific file types – Add -name "*.log" to target specific extensions
- Exclude mount points – Use -mount (or the equivalent -xdev) to avoid descending into other mounted filesystems
- Run during low usage – Schedule large searches during off-peak hours
- Combine with other tools – Use ionice to reduce I/O priority (see the example after this list)
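As an example of the last two tips, one way to keep a full-filesystem scan from competing with production workloads is to run it at the idle I/O class and lowest CPU priority (ionice -c3 and nice -n19):
# Run the scan with idle I/O scheduling and lowest CPU priority
ionice -c3 nice -n19 find / -type f -size +100M -ls 2>/dev/null > large_files.txt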
Related Commands and Tools
- ncdu – Interactive disk usage analyzer with an ncurses interface
- baobab – Graphical disk usage analyzer for GNOME
- qdirstat – Qt-based directory statistics tool
- tree – Display directory structure with sizes
- lsof – List open files, useful for finding files in use (see the example after this list)
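One kind of wasted space find cannot see is a file that was deleted while a process still holds it open; the space is not freed until the process closes it. lsof can reveal these (+L1 lists open files with a link count below one):
# Deleted-but-still-open files that still occupy disk space
sudo lsof -nP +L1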
Quick Reference Commands
# Most common use cases
find / -type f -size +100M -ls 2>/dev/null # System-wide large files
find /home -type f -size +500M -exec ls -lh {} \; 2>/dev/null # Home directory large files
du -sh /var/* | sort -hr # Largest directories in /var
find /tmp -type f -size +50M -mtime +1 -delete # Clean old temp files
This approach to finding large files on Linux helps you efficiently identify and manage the files consuming disk space on your system, whether for routine maintenance, troubleshooting storage issues, or optimizing performance.
Detailed Explanation
This bash command searches for and displays all large files in the system.
Here’s a detailed explanation of each part:
find – The command to search for files and directories in the filesystem
/ – The starting directory for the search (root of the filesystem, so it searches the entire system)
-type f – Specifies that we want to search only for regular files (excludes directories, symbolic links, etc.)
-size +100M – Searches for files larger than 100 megabytes. The + indicates “greater than”, while M specifies the unit in megabytes
-ls – Shows results in detailed format, similar to the ls -l command, including permissions, owner, size, modification date, and full path
2>/dev/null – Redirects all error messages (stderr) to /dev/null, which is like a system “black hole”. This hides errors that might appear when find tries to access directories for which you don’t have permissions
In summary: the command finds all files in the system that are larger than 100MB and displays them with detailed information, suppressing any error messages for inaccessible directories.
It’s very useful for identifying files that take up a lot of disk space and cleaning up the system.