Bash and Shell Scripting

The command line is one of the most powerful interfaces available to any developer, sysadmin, or power user. Mastering shell scripting means you can automate repetitive tasks, glue together complex pipelines, and manage infrastructure with confidence. Whether you are writing a quick one-liner or building a multi-hundred-line deployment script, the principles covered in this hub will serve you every day.

Shell scripting sits at the crossroads of system administration, software development, and data processing. A well-written shell script can replace dozens of manual steps with a single invocation. The UNIX philosophy of small, composable tools connected by pipes remains remarkably relevant decades after its inception, and Bash is the glue that holds it all together.

Why Learn Shell Scripting?

Every server you SSH into, every CI/CD pipeline you configure, and every container you build starts with a shell. Understanding how to navigate and script the shell is not optional for anyone working in technology -- it is foundational. Here are a few concrete reasons to invest time in shell mastery:

  • Automation: Replace manual, error-prone steps with reproducible scripts. A ten-step deployment procedure becomes a single command that runs identically every time.
  • Portability: Bash is available on virtually every Linux distribution, on macOS (which ships an older Bash 3.2 by default), and on Windows via WSL. Scripts you write today will run on servers you provision years from now.
  • Speed: For many file-manipulation and text-processing tasks, a shell one-liner is faster to write and run than a Python or Node.js program. When you need to rename a thousand files, parse a log, or chain three commands together, the shell is unbeatable.
  • Glue: Shell scripts excel at orchestrating other programs, combining their strengths without rewriting logic. You can pipe the output of a database query into a text processor and then into an email -- all in one line.
  • Career Value: Shell proficiency is expected in DevOps, SRE, backend engineering, data engineering, and security roles. It shows up in interviews, on-call rotations, and daily workflow.
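To make the speed and glue points concrete, here is a short pipeline that counts HTTP status codes in an access log. The log contents here are synthetic, generated just for the demo:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Create a small synthetic access log for the demo.
log=$(mktemp)
printf '%s\n' \
    'GET /index.html 200' \
    'GET /missing 404' \
    'POST /api/login 200' \
    'GET /favicon.ico 404' > "$log"

# Classic glue: extract the status field, sort, count, rank by frequency.
counts=$(awk '{print $3}' "$log" | sort | uniq -c | sort -rn)
echo "$counts"

rm -f "$log"
```

Four small tools, one pipeline, no program to write -- exactly the kind of job where the shell beats a general-purpose language on time-to-answer.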

A Quick Taste

Below is a small script that backs up a directory, timestamps the archive, and removes backups older than seven days:

#!/usr/bin/env bash
set -euo pipefail

SOURCE_DIR="/var/www/myapp"
BACKUP_DIR="/backups"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
ARCHIVE="${BACKUP_DIR}/myapp_${TIMESTAMP}.tar.gz"

tar -czf "$ARCHIVE" -C "$(dirname "$SOURCE_DIR")" "$(basename "$SOURCE_DIR")"
echo "Backup created: $ARCHIVE"

# Remove backups older than 7 days
find "$BACKUP_DIR" -name "myapp_*.tar.gz" -mtime +7 -delete
echo "Old backups cleaned."

This short script already demonstrates several key concepts: the shebang line, strict mode with set -euo pipefail, variable expansion, command substitution, and the find utility. Each of these topics is covered in depth across the dedicated guides listed below.
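If command substitution or variable expansion is new to you, here is a minimal illustration of both, using made-up names:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Command substitution: $(...) runs a command and captures its stdout.
today=$(date +%Y-%m-%d)

# Variable expansion: ${var} interpolates inside a double-quoted string.
name="backup"
archive="${name}_${today}.tar.gz"

echo "$archive"   # e.g. backup_2024-05-01.tar.gz
```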

Here is another practical example -- a script that checks whether each service in a list is running and reports its status:

#!/usr/bin/env bash
set -euo pipefail

services=("nginx" "postgresql" "redis-server")

for svc in "${services[@]}"; do
    if systemctl is-active --quiet "$svc"; then
        printf "%-20s %s\n" "$svc" "RUNNING"
    else
        printf "%-20s %s\n" "$svc" "STOPPED"
    fi
done

Even these modest examples illustrate the breadth of what Bash can do with very little code.

What This Hub Covers

The shell scripting section is organized into focused topics so you can jump directly to what you need or work through them sequentially for a comprehensive education.

Core Language

  • Bash Fundamentals -- Variables, quoting, conditionals, loops, arrays, positional parameters, and exit codes. Start here if you are new to scripting.
  • Advanced Bash -- Associative arrays, functions, traps, strict mode, process substitution, here strings, and debugging techniques for production-grade scripts.

Text Processing

  • Sed and Awk -- The twin pillars of stream editing and structured text processing. Learn substitution, address ranges, field splitting, and powerful one-liners that solve real problems.
  • Regular Expressions Guide -- BRE, ERE, and PCRE syntax, character classes, quantifiers, anchors, groups, lookaheads, and practical patterns for log parsing and validation.

Productivity and Environment

  • CLI Productivity -- tmux, aliases, fzf, ripgrep, dotfile management, direnv, and the Starship prompt. Make your terminal a joy to use every single day.
  • Zsh and Fish -- Oh My Zsh, Powerlevel10k, Fish shell, plugin management, completions, and tips for migrating from Bash to a more modern interactive shell.

Automation

  • Task Automation -- Cron jobs, systemd timers, the at command, scheduling best practices, and a detailed comparison of cron with systemd timers.

Reference

For a comprehensive quick-reference covering commands across networking, file systems, processes, and more, see the Unix Toolbox. It is designed as a cheat sheet you can bookmark and return to whenever you need a command you have not used recently.

Getting Started

If you are completely new, begin with Bash Fundamentals to build a solid foundation, then move to Advanced Bash once you are comfortable with variables, loops, and conditionals. From there, branch out into whichever topic matches your current needs.

For experienced users looking to sharpen specific skills, jump directly to the relevant guide. Each page is self-contained, with cross-links to related topics where deeper context would help.

The Philosophy Behind Good Shell Scripts

Writing shell scripts is easy. Writing good shell scripts requires discipline. Here are principles that separate production-quality scripts from fragile hacks:

  1. Always use strict mode. Start every script with set -euo pipefail to catch errors early. This single line prevents entire categories of bugs.
  2. Quote your variables. Unquoted variables are the number-one source of subtle bugs in shell scripts. Word splitting and glob expansion on unquoted variables cause breakage that is difficult to diagnose.
  3. Prefer built-in constructs over external commands when performance matters, but prefer clarity over cleverness. A readable script is maintainable; a clever one is a liability.
  4. Use functions to organize logic and enable reuse. Functions make scripts testable and easier to reason about.
  5. Handle errors explicitly. Use trap to clean up temporary files and report failures. Do not assume commands will succeed.
  6. Document with comments. Future you will thank present you. Explain the why, not the what.
  7. Use ShellCheck. Run shellcheck on every script before committing. It catches quoting errors, deprecated syntax, and common pitfalls automatically.

Putting these principles together, here is a minimal template you can adapt for new scripts:
#!/usr/bin/env bash
set -euo pipefail

PROG_NAME=$(basename "$0")
TEMP_DIR=$(mktemp -d)
readonly PROG_NAME TEMP_DIR

cleanup() {
    rm -rf "$TEMP_DIR"
    echo "${PROG_NAME}: cleaned up temp files." >&2
}
trap cleanup EXIT

log() {
    echo "[$(date +%T)] $*" >&2
}

main() {
    log "Working in $TEMP_DIR"
    # ... your logic here ...
    log "Done."
}

main "$@"

This template gives you automatic cleanup, a clear entry point, structured logging, and strict error handling from the very first line. Adopt a pattern like this and your scripts will be dramatically more reliable.
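Principle 2, quoting, is worth seeing in action. The snippet below, using a made-up filename, shows how an unquoted variable undergoes word splitting:

```shell
#!/usr/bin/env bash
set -euo pipefail

# A filename containing a space -- the classic trigger for quoting bugs.
file="my report.txt"

# Unquoted (intentionally, for demonstration): word splitting
# breaks the single value into two arguments.
set -- $file
echo "unquoted: $# arguments"   # prints: unquoted: 2 arguments

# Quoted: the value stays a single argument.
set -- "$file"
echo "quoted: $# argument"      # prints: quoted: 1 argument
```

ShellCheck flags the unquoted expansion automatically, which is one more reason principle 7 exists.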

Shell Scripting in the Modern Stack

Shell scripts are not relics of the past. They power Dockerfiles, GitHub Actions workflows, Makefiles, pre-commit hooks, database migration wrappers, and countless other pieces of modern infrastructure. Kubernetes operators often shell out for health checks. Terraform provisioners run shell commands. Even sophisticated deployment tools like Ansible execute shell modules under the hood.

Consider a typical CI/CD pipeline: it runs linting via a shell command, executes tests via another, builds a Docker image using a shell-based Dockerfile, and deploys with shell-driven Helm or kubectl invocations. At every stage, understanding shell scripting gives you the ability to debug failures, customize behavior, and optimize performance.
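To make that concrete, here is a sketch of the kind of step runner such a pipeline might use. The stage commands are placeholders, not a real CI configuration:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Run one pipeline stage, stop the build on the first failure,
# and report which stage broke. Stage commands below are placeholders.
run_stage() {
    local name=$1; shift
    echo "--- ${name} ---"
    if "$@"; then
        echo "${name}: ok"
    else
        echo "${name}: FAILED" >&2
        exit 1
    fi
}

run_stage "lint"  echo "linting sources"
run_stage "test"  echo "running unit tests"
run_stage "build" echo "building image"
```

When a real stage fails, the script exits nonzero, which is exactly how CI systems decide whether a pipeline passed.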

Modern infrastructure-as-code tools like Pulumi and CDK may use general-purpose languages, but they still rely on shell scripts for bootstrapping, local development, and operational runbooks. Shell scripting is not going away -- it is becoming more important as infrastructure grows in complexity.

Learning shell scripting is an investment that pays dividends across every technology domain. Dive into the guides below and start building your command-line expertise today.

Explore Shell & Scripting

Bash Fundamentals: Variables, Loops, and Conditionals

Bash fundamentals: variables, arrays, if/else, for/while loops, case statements, and basic script structure.

Advanced Bash: Arrays, Functions, and Error Handling

Advanced Bash scripting: associative arrays, functions, error handling with trap, subshells, process substitution, and debugging.

sed and awk: Text Processing Power Tools

sed and awk guide: pattern matching, substitution, field processing, one-liners, and practical text transformation examples.

Linux Command Line Productivity: tmux, aliases, and dotfiles

Boost CLI productivity: tmux sessions, shell aliases, dotfiles management, fzf, ripgrep, and terminal multiplexing.

Regular Expressions for System Administrators

Regular expressions guide for sysadmins: POSIX and PCRE syntax, grep, sed, awk, and practical pattern matching examples.

Zsh and Fish: Modern Shell Alternatives

Zsh and Fish shell guide: Oh My Zsh, Powerlevel10k, Fish abbreviations, syntax highlighting, and migrating from Bash.

Automating Tasks with cron, at, and systemd Timers

Linux task automation: cron syntax, crontab management, at for one-time jobs, and systemd timers for modern scheduling.