Karandeep Singh
Posted on May 29, 2024
Conducting a code review for Bash scripts is essential to ensure they are error-free, secure, and easy to maintain. Reviewing Bash scripts helps catch mistakes early, improves code quality, and ensures best practices are followed. Here's a detailed guide on how to review Bash scripts effectively, with explanations and examples of good and bad code for each step.
1. Understand the Purpose of the Script
Before reviewing, understand what the script is supposed to do. This helps in contextualizing the code and spotting deviations.
Good:
# This script backs up the user's home directory to /backup
Bad:
# backup script
2. Check for Shebang and Execution Permissions
Ensure the script starts with a shebang to specify the interpreter and that it has executable permissions.
Good:
#!/bin/bash
chmod +x script.sh
Bad:
#!/bin/sh  # misleading interpreter for a script that relies on Bash-only features such as [[ ]] or arrays
3. Syntax and Semantics
Look for syntax errors and semantic issues. Use tools like shellcheck to detect common mistakes.
Good:
if [ -f "$file" ]; then
echo "File exists."
fi
Bad:
if [ -f "$file" ] then
echo "File exists."
4. Readability and Maintainability
Check for proper indentation, meaningful variable names, and adequate comments.
Good:
for file in *.txt; do
echo "Processing $file"
done
Bad:
for f in *.txt; do echo "Processing $f"; done
5. Error Handling
Ensure the script handles errors gracefully using proper error handling mechanisms.
Good:
set -euo pipefail
trap 'echo "Error occurred"; exit 1' ERR
Bad:
# No error handling
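Putting these pieces together, a minimal skeleton for the top of a script could look like the sketch below; the message wording and the cp command are only illustrative:
#!/bin/bash
set -euo pipefail

# Report where the failure happened, then stop
trap 'echo "Error on line $LINENO" >&2; exit 1' ERR

cp source.txt destination.txt  # any failing command now aborts the script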
6. Security Considerations
Look for potential security issues like unchecked user input and improper handling of sensitive data.
Good:
if [[ "$user_input" =~ ^[a-zA-Z0-9_]+$ ]]; then
echo "Valid input"
fi
Bad:
eval $user_input
7. Performance and Efficiency
Assess the script for performance bottlenecks and unnecessary use of resources.
Good:
grep "pattern" file.txt
Bad:
cat file.txt | grep "pattern"
8. Adherence to Best Practices
Ensure the script follows best practices for Bash scripting.
Good:
result=$(command)
Bad:
result=`command`
9. Dependency Management
Identify any external dependencies and ensure they are clearly documented.
Good:
# Requires rsync
if ! command -v rsync &> /dev/null; then
echo "rsync could not be found"
exit 1
fi
Bad:
rsync -avh source/ destination/
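If the script needs several external tools, the same check generalizes to a loop; the tool list below is only an example:
for cmd in rsync tar gzip; do
  if ! command -v "$cmd" > /dev/null 2>&1; then
    echo "Required command '$cmd' is not installed" >&2
    exit 1
  fi
done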
10. Portability
Check if the script uses features or commands specific to a particular shell or system.
Good:
# POSIX compliant
if [ -d "$DIR" ]; then
echo "Directory exists."
fi
Bad:
[[ -d "$DIR" ]] && echo "Directory exists."
11. Documentation
Verify that the script includes a header comment explaining its purpose and usage instructions.
Good:
# Script to backup user's home directory
# Usage: ./backup.sh
Bad:
# Backup script
12. Testing
Ensure the script has been tested in different environments and scenarios.
Good:
# Test script
./test_backup.sh
Bad:
# No testing
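A test script does not need a framework; even a small smoke test such as the hypothetical test_backup.sh below (the script name and archive path are placeholders) adds real value:
#!/bin/bash
set -euo pipefail

./backup.sh
if [ -f backup.tar.gz ]; then
  echo "PASS: archive created"
else
  echo "FAIL: archive missing" >&2
  exit 1
fi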
13. Variable Naming
Use meaningful and descriptive variable names to improve readability.
Good:
file_count=0
Bad:
fc=0
14. Avoid Hardcoding Values
Use variables instead of hardcoding values to make the script more flexible.
Good:
backup_dir="/backup"
Bad:
cd /backup
15. Use Functions for Reusable Code
Encapsulate reusable code in functions to improve modularity and readability.
Good:
backup_files() {
tar -czf backup.tar.gz /home/user
}
Bad:
tar -czf backup.tar.gz /home/user
16. Check Command Success
Always check if a command succeeded and handle the failure case appropriately.
Good:
if ! cp source.txt destination.txt; then
echo "Copy failed"
exit 1
fi
Bad:
cp source.txt destination.txt
17. Use Meaningful Exit Codes
Use exit codes that reflect the script's outcome: 0 for success and distinct non-zero values for specific failure modes.
Good:
exit 0  # success
exit 2  # e.g. a required argument is missing
Bad:
exit 1  # the same code returned for every failure, whatever the cause
18. Avoid Useless Use of cat
Combine commands to avoid unnecessary use of cat.
Good:
grep "pattern" file.txt
Bad:
cat file.txt | grep "pattern"
19. Quotes Around Variables
Always quote variables to prevent word splitting and globbing issues.
Good:
echo "File: $file"
Bad:
echo File: $file
20. Avoid Global Variables
Use local variables within functions to avoid side effects.
Good:
main() {
local file_count=0
}
Bad:
file_count=0
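The difference is easy to demonstrate with a small sketch: because the function declares its own local copy, the caller's variable is left untouched.
file_count=0

count_txt_files() {
  local file file_count=0
  for file in *.txt; do
    file_count=$((file_count + 1))
  done
  echo "$file_count"
}

count_txt_files
echo "$file_count"  # still 0 - the global was never modified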
21. Proper Use of Arrays
Use arrays for lists of items to simplify the code.
Good:
files=(file1.txt file2.txt)
for file in "${files[@]}"; do
echo "Processing $file"
done
Bad:
file1=file1.txt
file2=file2.txt
for file in $file1 $file2; do
echo "Processing $file"
done
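Arrays also combine naturally with globbing; the sketch below collects matching files into an array and handles names containing spaces correctly:
shopt -s nullglob  # make the array empty when nothing matches
files=(*.txt)
echo "Found ${#files[@]} text file(s)"
for file in "${files[@]}"; do
  printf 'Processing %s\n' "$file"
done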
22. Avoiding Command Substitution in Loops
Avoid iterating over the output of command substitution: it word-splits and glob-expands the text and reads the whole file into memory. Read input line by line instead.
Good:
while read -r line; do
echo "$line"
done < file.txt
Bad:
for line in $(cat file.txt); do
echo "$line"
done
23. Proper Use of printf
Use printf instead of echo for better formatting control.
Good:
printf "File: %s\n" "$file"
Bad:
echo "File: $file"
24. Check for Unset Variables
Use set -u to treat unset variables as an error.
Good:
set -u
echo "Variable: ${var:-default}"
Bad:
echo "Variable: $var"
25. Proper Use of trap
Use trap to handle cleanup tasks and ensure they run even if the script exits unexpectedly.
Good:
trap 'rm -f temp.txt; exit' INT TERM
Bad:
# No cleanup
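Registering the cleanup on EXIT means it runs on normal termination as well as on most early exits; a minimal sketch:
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT

echo "working data" > "$tmpfile"
# ... rest of the script; the trap removes the file when the script ends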
26. Avoiding Multiple Redirections
Combine redirections to avoid multiple file handles.
Good:
{
echo "Line 1"
echo "Line 2"
} > output.txt
Bad:
echo "Line 1" > output.txt
echo "Line 2" >> output.txt
27. Using Built-in Shell Commands
Prefer built-in shell commands over external utilities where possible.
Good:
files=(*)  # shell globbing, no external process
Bad:
files=$(ls)  # spawns ls and breaks on unusual filenames
28. Avoiding the Use of eval
Avoid eval to prevent potential security risks such as command injection.
Good:
cmd=(ls -l)
"${cmd[@]}"
Bad:
eval $cmd
29. Proper Use of read
Use read with proper options to handle input safely.
Good:
read -r user_input
Bad:
read user_input
30. Using || and && for Command Chaining
Use || and && for conditional command execution.
Good:
command1 && command2
command1 || echo "Command1 failed"
Bad:
if command1; then
command2
fi
if ! command1; then
echo "Command1 failed"
fi
31. Using case Instead of Multiple if Statements
Use case for multiple conditions to improve readability.
Good:
case $var in
pattern1) echo "Pattern 1";;
pattern2) echo "Pattern 2";;
esac
Bad:
if [ "$var" == "pattern1" ]; then
echo "Pattern 1"
elif [ "$var" == "pattern2" ]; then
echo "Pattern 2"
fi
32. Properly Handling File Descriptors
Use explicit file descriptors when you need to manage several input/output streams at once, for example when the loop body must still read from standard input.
Good:
exec 3< input.txt
while read -r line <&3; do
echo "$line"
done
exec 3<&-
Bad:
while read -r line; do
echo "$line"
done < input.txt
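Explicit descriptors pay off when more than one stream is involved; for instance, this sketch reads two files in lockstep, something a single done < input.txt redirection cannot express (the filenames are placeholders):
exec 3< first.txt
exec 4< second.txt
while read -r a <&3 && read -r b <&4; do
  printf '%s | %s\n' "$a" "$b"
done
exec 3<&- 4<&-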
33. Using select for Menu Options
Use select to create simple menus.
Good:
select option in "Option 1" "Option 2" "Quit"; do
case $option in
"Option 1") echo "You chose Option 1";;
"Option 2") echo "You chose Option 2";;
"Quit") break;;
esac
done
Bad:
echo "1. Option 1"
echo "2. Option 2"
echo "3. Quit"
read -r choice
case $choice in
1) echo "You chose Option 1";;
2) echo "You chose Option 2";;
3) exit;;
esac
34. Using dirname and basename
Use dirname and basename to handle file paths; they read more clearly than parameter expansion and behave predictably when a path contains no slash.
Good:
dir=$(dirname "$file_path")
file=$(basename "$file_path")
Bad:
dir=${file_path%/*}
file=${file_path##*/}
35. Using mktemp for Temporary Files
Use mktemp to create temporary files securely.
Good:
tmpfile=$(mktemp)
echo "Temporary file: $tmpfile"
Bad:
tmpfile="/tmp/tempfile.$$"
echo "Temporary file: $tmpfile"
By following these guidelines and using these examples, you can conduct a thorough and effective code review of Bash scripts, ensuring they are robust, secure, and maintainable.
For more advanced Bash scripting tips, check out this article on Advanced String Operations in Bash: Building Custom Functions.