Head Command -n Option PDF
Document Details
Uploaded by SmarterAbundance
Summary
This document provides notes on the command-line tools `cut`, `grep`, `cat`, `sort`, and `uniq`. It covers the main flags of each command, including extracting fields from lines of text, searching for patterns, sorting text, and removing duplicate lines, and shows how the commands are combined with pipelines and redirection.
Full Transcript
# Notes for Command-Line Tools: `cut`, `grep`, `cat`, `sort`, `uniq`

## 1. `cut`

The `cut` command extracts specific sections from each line of input, based on fields, bytes, or characters.

**Key Flags:**

- `-d [delimiter]`: Specify a delimiter for splitting the line (default is tab).
- `-f [fields]`: Extract specific fields (e.g., `-f1,3` for fields 1 and 3).
- `-b [bytes]`: Extract specific byte positions (e.g., `-b1-5` for the first five bytes).
- `-c [characters]`: Extract specific character positions (e.g., `-c1-3` for the first three characters).

**Examples:**

```bash
# Extract the first column (default delimiter is tab)
cut -f1 file.txt

# Extract the 2nd and 4th fields from a CSV file
cut -d',' -f2,4 data.csv

# Extract the first 5 characters from each line
cut -c1-5 file.txt
```

**With Pipes:**

```bash
# Combine with grep to search and extract fields
grep "pattern" file.txt | cut -d',' -f1
```

## 2. `grep`

The `grep` command searches for patterns in text files or input streams.

**Key Flags:**

- `-i`: Case-insensitive search.
- `-v`: Invert match (exclude lines matching the pattern).
- `-E`: Enable extended regular expressions (like `egrep`).
- `-r`: Recursively search files in a directory.
- `-n`: Display line numbers with matching lines.
- `-c`: Count the number of matching lines.
- `-o`: Show only the matching part of each line.

Printed using Save ChatGPT as PDF, powered by PDFCrowd HTML to PDF API.

**Examples:**

```bash
# Find lines containing "error" (case-insensitive)
grep -i "error" log.txt

# Exclude lines containing "warning"
grep -v "warning" log.txt

# Display matching lines with line numbers
grep -n "error" log.txt
```

**With Pipes:**

```bash
# Search for a pattern in the output of another command
cat file.txt | grep "search term"
```

## 3. `cat`

The `cat` command is used to display, concatenate, or redirect file contents.

**Key Flags:**

- `-n`: Number all lines.
- `-b`: Number non-empty lines only.
- `-s`: Suppress repeated empty lines.
- `-E`: Display `$` at the end of each line (to visualize line endings).
**Examples:**

```bash
# Display the contents of a file
cat file.txt

# Concatenate two files into one
cat file1.txt file2.txt > combined.txt

# Display file contents with line numbers
cat -n file.txt
```

**With Pipes:**

```bash
# Combine files and search for a pattern
cat file1.txt file2.txt | grep "pattern"
```

## 4. `sort`

The `sort` command sorts lines of text files or input.

**Key Flags:**

- `-r`: Sort in reverse order.
- `-n`: Sort numerically (e.g., `1`, `2`, `10` instead of `1`, `10`, `2`).
- `-u`: Remove duplicates (output unique lines only).
- `-t [delimiter]`: Use a specific delimiter for sorting fields.
- `-k [key]`: Sort based on a specific field (e.g., `-k2` for the second field).

**Examples:**

```bash
# Sort lines alphabetically
sort file.txt

# Sort numerically
sort -n numbers.txt

# Sort by the second column (fields separated by commas)
sort -t',' -k2 file.csv
```

**With Pipes:**

```bash
# Sort output from another command in reverse order
grep "pattern" file.txt | sort -r
```

## 5. `uniq`

The `uniq` command filters out or processes adjacent duplicate lines.

**Key Flags:**

- `-c`: Count occurrences of each line.
- `-d`: Only print duplicate lines.
- `-u`: Only print lines that are not repeated.
- `-i`: Ignore case when comparing lines.

**Examples:**

```bash
# Remove duplicate lines (adjacent only)
uniq file.txt

# Count occurrences of each line
uniq -c file.txt

# Print only duplicate lines
uniq -d file.txt
```

**With Pipes:**

```bash
# Sort lines first, then remove duplicates
sort file.txt | uniq
```

## Combining Tools with Pipes and Redirection

The real power of these commands is seen when used together in pipelines.
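Because `uniq` only compares adjacent lines, sorting first is what lets it remove every duplicate. A minimal sketch demonstrating the difference (the file `fruits.txt` and its contents are hypothetical):

```bash
# Create a sample file with non-adjacent duplicates
printf 'apple\nbanana\napple\nbanana\n' > fruits.txt

# uniq alone only collapses adjacent duplicates, so nothing is removed here
uniq fruits.txt
# apple
# banana
# apple
# banana

# Sorting first makes duplicates adjacent, so uniq can remove them all
sort fruits.txt | uniq
# apple
# banana
```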
**Example: Process a CSV File**

```bash
# Extract the first column, sort it, and count occurrences of each value
cut -d',' -f1 data.csv | sort | uniq -c
```

**Example: Search, Sort, and Filter Data**

```bash
# Find lines containing "error", sort them, and remove duplicates
grep "error" log.txt | sort | uniq
```

**Example: Redirect Output to a File**

```bash
# Save the sorted and unique lines into a new file
cat file.txt | sort | uniq > output.txt
```

## Summary Table of Commands and Flags

| Command | Key Flags | Purpose |
| --- | --- | --- |
| `cut` | `-d [delimiter]`, `-f [fields]`, `-b [bytes]`, `-c [characters]` | Extract fields, bytes, or characters from each line of input. |
| `grep` | `-i`, `-v`, `-E`, `-r`, `-n`, `-c`, `-o` | Search for patterns using regular expressions. |
| `cat` | `-n`, `-b`, `-s`, `-E` | Display, concatenate, or redirect file contents. |
| `sort` | `-r`, `-n`, `-u`, `-t [delimiter]`, `-k [key]` | Sort lines of text alphabetically, numerically, or by fields. |
| `uniq` | `-c`, `-d`, `-u`, `-i` | Remove adjacent duplicates, count occurrences, or filter unique lines. |

These tools, combined with pipes (`|`) and redirection (`>` or `>>`), form the backbone of shell-based text processing!
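The CSV-counting pattern above can be extended with a second `sort` to rank values by frequency, a common end-to-end pipeline. A sketch (the file `events.csv` and its contents are hypothetical):

```bash
# Hypothetical CSV of user,action events
printf 'alice,login\nbob,login\nalice,logout\nalice,login\n' > events.csv

# 1. cut extracts the user column
# 2. sort groups identical users together
# 3. uniq -c counts each group
# 4. sort -rn orders the counts numerically, highest first
cut -d',' -f1 events.csv | sort | uniq -c | sort -rn
# alice (3 events) is listed first, then bob (1 event)
```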