CIS126RH | RHEL System Administration 1 | Mesa Community College
Learning Objectives
1. Understand standard streams: know stdin, stdout, and stderr and their file descriptors
2. Redirect output to files: use > and >> to capture stdout and stderr
3. Redirect input from files: use < and here documents to provide input
4. Connect commands with pipes: build powerful command pipelines using |
Standard Streams
Every Linux process has three standard streams for data flow. By default, they connect to the terminal, but redirection changes where data flows.
Stream    FD    Name               Purpose                        Default
stdin     0     Standard Input     Data going INTO the command    Keyboard
stdout    1     Standard Output    Normal output FROM command     Terminal screen
stderr    2     Standard Error     Error messages FROM command    Terminal screen
Default Data Flow
Keyboard → stdin (0) → Command
Command  → stdout (1) → Screen
Command  → stderr (2) → Screen
Key Insight: stdout and stderr both go to the screen by default, but they're separate streams that can be redirected independently.
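One quick way to see the three descriptors is to list /proc/self/fd, which shows where 0, 1, and 2 currently point (a minimal sketch; out.txt is just a scratch file):
# By default all three descriptors point at the terminal (a /dev/pts entry)
ls -l /proc/self/fd
# After redirection, fd 1 points at out.txt instead of the terminal
ls -l /proc/self/fd > out.txt
cat out.txt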
Output Redirection Operators
Operator    Description                                     Example
>           Redirect stdout to file (overwrite)             ls > files.txt
>>          Redirect stdout to file (append)                echo "text" >> log.txt
2>          Redirect stderr to file (overwrite)             cmd 2> errors.txt
2>>         Redirect stderr to file (append)                cmd 2>> errors.txt
&>          Redirect both stdout and stderr                 cmd &> all.txt
2>&1        Redirect stderr to same place as stdout         cmd > out.txt 2>&1
Redirecting stdout
# Redirect output to new file (overwrites if exists!)
ls -l /etc > etc_listing.txt
# View the captured output
cat etc_listing.txt
total 1348
drwxr-xr-x. 3 root root 4096 Nov 1 10:00 abrt
-rw-r--r--. 1 root root 16 Nov 1 10:00 adjtime
...
# Append to existing file (preserves content)
echo "Listing generated on $(date)" >> etc_listing.txt
# Create empty file (or truncate existing)
> empty_file.txt
# Overwrite protection with noclobber
set -o noclobber # Enable protection
ls > existing.txt # Error if file exists
ls >| existing.txt # Force overwrite anyway
set +o noclobber # Disable protection
Redirecting stderr
# Command that produces both stdout and stderr
ls /etc /nonexistent
/etc:
hostname hosts passwd ...
ls: cannot access '/nonexistent': No such file or directory
# Redirect only stderr to file
ls /etc /nonexistent 2> errors.txt
/etc:
hostname hosts passwd ...
# Check the error file
cat errors.txt
ls: cannot access '/nonexistent': No such file or directory
# Discard stderr (send to /dev/null)
ls /etc /nonexistent 2> /dev/null
/etc:
hostname hosts passwd ...
# Append errors to log file
find / -name "*.conf" 2>> search_errors.log
Combining stdout and stderr
# Method 1: Redirect both to same file (bash shorthand)
ls /etc /nonexistent &> all_output.txt
# Method 2: Redirect stdout, then stderr to same place
ls /etc /nonexistent > all_output.txt 2>&1
# ORDER MATTERS! This is WRONG:
ls /etc /nonexistent 2>&1 > output.txt   # stderr still goes to terminal!
# Redirect to separate files
ls /etc /nonexistent > stdout.txt 2> stderr.txt
# Append both to same file
command >> combined.log 2>&1
# Discard all output (stdout and stderr)
command &> /dev/null
command > /dev/null 2>&1 # Same effect, more portable
⚠ Order Matters: In cmd > file 2>&1, redirect stdout first, then point stderr to it. In the reversed order (cmd 2>&1 > file), 2>&1 copies stderr to stdout's current target, the terminal, before stdout is moved into the file, so errors still appear on screen.
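To see the difference, run the same ls example both ways and watch what still reaches the terminal (a minimal sketch; both.txt is just a scratch file):
# Correct order: stdout goes to the file first, then stderr follows it
ls /etc /nonexistent > both.txt 2>&1      # nothing appears on screen
# Reversed order: 2>&1 copies stderr to the terminal (stdout's target at
# that moment), then only stdout is moved into the file
ls /etc /nonexistent 2>&1 > both.txt      # error still prints to the terminal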
Input Redirection
# Redirect input from a file
wc -l < /etc/passwd
45
# Compare: command reads file vs shell provides input
wc -l /etc/passwd       # wc opens the file (shows filename)
45 /etc/passwd
wc -l < /etc/passwd     # shell provides content (no filename)
45
# Send email from file
mail -s "Report" user@example.com < report.txt
# Combine input and output redirection
sort < unsorted.txt > sorted.txt
# Multiple redirections
command < input.txt > output.txt 2> errors.txt
Subtle Difference: With <, the shell opens the file and feeds it to the command. The command doesn't know the filename.
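A simple way to observe who opens the file is to point both forms at a file that does not exist (missing.txt is hypothetical):
# wc opens the file itself, so the error message comes from wc
wc -l missing.txt
# The shell opens the file before wc ever starts, so bash reports the
# error and wc is never run
wc -l < missing.txt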
Here Documents
A here document (heredoc) provides multi-line input inline in a script or command, without needing a separate file.
# Basic here document syntax
cat << EOF
This is line 1
This is line 2
Variables expand: $HOME
EOF
# Prevent variable expansion with quotes
cat << 'EOF'
Variables do NOT expand: $HOME
This prints literally: $(whoami)
EOF
# Use in scripts to create files
cat << EOF > config.txt
server=localhost
port=8080
user=$USER
EOF
# The delimiter can be any word (EOF is convention)
cat << MYEND
Content here
MYEND
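A here document is ordinary stdin, so it works with any command that reads standard input, not just cat. A small sketch (the package names are only illustrative):
# Feed an inline list to sort via a here document
sort << EOF
httpd
bash
zsh
EOF
bash
httpd
zsh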
Here Strings
# Here string - single line input
wc -w <<< "count the words in this string"
6
# Useful with commands that read stdin
bc <<< "5 * 4 + 3"
23
# With variables
greeting="Hello World"
cat <<< "$greeting"
Hello World
# Compare: echo with pipe vs here string
echo "test data" | wc -c     # Creates subshell
wc -c <<< "test data"        # No subshell
# Read first field from string
read first rest <<< "apple banana cherry"
echo $first
apple
Here Strings are simpler than here documents for single-line input, and more efficient than echo | command.
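The subshell difference matters when the command sets a variable you want to keep, as in this sketch with read:
# With a pipe, read runs in a subshell, so the variable is lost afterwards
echo "apple banana" | read fruit
echo "$fruit"                    # prints an empty line
# With a here string, read runs in the current shell and the value persists
read fruit <<< "apple banana"
echo "$fruit"
apple banana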
Pipes
A pipe connects the stdout of one command directly to the stdin of another, creating a data processing pipeline.
Command 1 → stdout → | → stdin → Command 2
# Basic pipe - output of ls becomes input to wc
ls /etc | wc -l
245
# Filter output through grep
cat /etc/passwd | grep "bash"
# Chain multiple commands
cat /var/log/messages | grep "error" | wc -l
# Sort and remove duplicates
cut -d: -f7 /etc/passwd | sort | uniq
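Note that a pipe carries only stdout; stderr bypasses it unless you merge the streams first. A quick sketch reusing the earlier ls example:
# The error skips the pipe and still prints to the terminal; wc counts only stdout
ls /etc /nonexistent | wc -l
# Merge stderr into stdout before the pipe to count everything
ls /etc /nonexistent 2>&1 | wc -l
# Bash shorthand for the same thing
ls /etc /nonexistent |& wc -l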
Building Pipelines
cat access.log → grep "404" → cut -d' ' -f7 → sort → uniq -c → sort -rn
# Find most common 404 errors in web log
cat access.log | grep "404" | cut -d' ' -f7 | sort | uniq -c | sort -rn
# Step by step (a pipe at the end of a line continues onto the next line):
cat access.log |     # Read the log file
  grep "404" |       # Keep only 404 error lines
  cut -d' ' -f7 |    # Extract the URL field
  sort |             # Sort URLs alphabetically
  uniq -c |          # Count unique occurrences
  sort -rn           # Sort by count, highest first
     47 /missing-page.html
     23 /old-link.php
     12 /typo.html
Common Pipeline Patterns
# Filter and count
grep "pattern" file | wc -l
# Sort and deduplicate
sort file | uniq
# Search process list
ps aux | grep "httpd"
# View paginated output
cat /var/log/messages | less
# Extract and format columns
df -h | awk '{print $1, $5}'
# Find large files
du -sh * | sort -rh | head -10
# Monitor log in real-time, filtered
tail -f /var/log/messages | grep --line-buffered "error"
# Count file types in directory
find . -type f | sed 's/.*\.//' | sort | uniq -c | sort -rn
The tee Command
tee reads from stdin and writes to both stdout AND one or more files simultaneously - like a T-junction in plumbing.
Command → stdout → tee file.txt → Screen
                       ↓ also writes to file.txt
# Save output while also viewing it
ls -l /etc | tee etc_listing.txt
# Append instead of overwrite
echo "new entry" | tee -a logfile.txt
# Write to multiple files
command | tee file1.txt file2.txt file3.txt
# Use with sudo (common pattern)
echo "setting=value" | sudo tee /etc/config.conf
The xargs Command
xargs builds and executes commands from stdin, converting input into arguments for another command.
# Delete files found by find
find /tmp -name "*.tmp" | xargs rm
# Handle filenames with spaces (-0 and -print0)
find . -name "*.log" -print0 | xargs -0 rm
# Limit arguments per command (-n)
echo "a b c d" | xargs -n 2 echo
a b
c d
# Interactive mode - confirm each command (-p)
find . -name "*.bak" | xargs -p rm
# Use placeholder for argument position (-I)
find . -name "*.txt" | xargs -I {} cp {} /backup/
# Parallel execution (-P)
find . -name "*.jpg" | xargs -P 4 -I {} convert {} -resize 50% small_{}
Practical Examples
# Log all errors from a script
./backup.sh 2>> /var/log/backup_errors.log
# Run command silently (discard all output)
./cron_job.sh &> /dev/null
# Save command output with timestamp
{ echo "=== $(date) ==="; df -h; } >> disk_report.txt
# Create file requiring root, with user content
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf
# Process files found by find
find /var/log -name "*.log" -mtime +30 | xargs -r gzip
# Compare two directories
diff <(ls dir1) <(ls dir2)
# Monitor multiple logs simultaneously
tail -f /var/log/messages /var/log/secure | grep -E "(error|fail)"
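The diff example above relies on process substitution: <(command) exposes the command's output as a temporary file-like path, so tools that expect filenames can read another command's output. A small sketch (the list files are hypothetical):
# Each <( ) becomes a readable path such as /dev/fd/63
diff <(sort list1.txt) <(sort list2.txt)        # compare two lists ignoring order
comm -12 <(sort list1.txt) <(sort list2.txt)    # show only lines common to both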
Common Mistakes
# WRONG: Redirecting to the same file you're reading
sort file.txt > file.txt               # Results in empty file!
# RIGHT: Use a temporary file or sponge
sort file.txt > temp.txt && mv temp.txt file.txt
sort file.txt | sponge file.txt        # (requires moreutils)
# WRONG: stderr redirect order
command 2>&1 > file.txt                # stderr still goes to terminal
# RIGHT: Redirect stdout first
command > file.txt 2>&1
# WRONG: Expecting pipe to modify original
cat file.txt | grep "pattern" | sort   # file.txt unchanged
# WRONG: Using redirect where pipe is needed
ls > wc -l                             # Creates a file named "wc"!
# RIGHT: Use pipe
ls | wc -l
Quick Reference
Operator            Description
cmd > file          Redirect stdout to file (overwrite)
cmd >> file         Redirect stdout to file (append)
cmd 2> file         Redirect stderr to file
cmd 2>&1            Redirect stderr to same place as stdout
cmd &> file         Redirect both stdout and stderr to file
cmd < file          Redirect stdin from file
cmd << EOF          Here document (multi-line input)
cmd <<< "string"    Here string (single-line input)
cmd1 | cmd2         Pipe stdout of cmd1 to stdin of cmd2
cmd | tee file      Send output to file AND stdout
Key Takeaways
1. Three streams: stdin (0), stdout (1), stderr (2)
2. Output: > overwrites, >> appends, 2> for stderr
3. Input: < from file, << here doc, <<< here string
4. Pipes | connect commands; tee splits output; xargs builds arguments
Graded Lab
1. Redirect ls output to a file, then append a timestamp with echo
2. Run find across / and send errors to /dev/null while viewing results
3. Use a here document to create a multi-line configuration file
4. Build a pipeline to find the 10 largest files in /var
5. Use tee to save and display command output simultaneously