Unix-like Operating Systems
Latest revision as of 13:26, 18 April 2024
General
- grub boot, change to run level 3
- Directions here: https://www.if-not-true-then-false.com/2012/howto-change-runlevel-on-grub2/
- put a 3 on the end of the line, then boot
- md5sum -c *.md5
- when you have file_whatever.iso and a file_whatever.iso.md5, and you want to check
- file = determine file type and output that to STDOUT
- Here Documents - writing files via command line
- enclose delimiter in single quotes if you want to turn off backtick, $variable and arithmetic expansion
- If the redirection operator is `<<-', then all leading tab characters are stripped from input lines and the line containing DELIMITER. This allows here-documents within shell scripts to be indented in a natural fashion.
cat > file_you_want_to_save_text_to.txt <<'EOF'
bunch
of
lines
and
metacharacters ! # ? * "
it's all good
EOF
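The single-quote rule above can be sketched in a few lines (hypothetical /tmp file names): with an unquoted delimiter the shell expands variables inside the here-document; quoting the delimiter turns that off.

```shell
#!/bin/sh
who=world

# Unquoted delimiter: $who is expanded inside the here-document
cat > /tmp/heredoc_expanded.txt <<EOF
hello $who
EOF

# Quoted delimiter: backtick/$variable/arithmetic expansion is off, $who stays literal
cat > /tmp/heredoc_literal.txt <<'EOF'
hello $who
EOF

cat /tmp/heredoc_expanded.txt   # hello world
cat /tmp/heredoc_literal.txt    # hello $who
```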
- gprof
- tar -h option dereferences symbolic links and you get the pointed-to file not the symlink
- Get a list of all files in a tarball:
- tar -tzf backup.tar.gz (GNU tar)
- lz backup.tar.gz
- tar zcvf - ./ | ssh user@host "cat > " - "tar over ssh"
- puppetd
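The tar-over-ssh trick above works because tar writes the archive to stdout (-f -); here is a minimal local sketch of the same pipeline, with a plain cat standing in for the remote ssh end (hypothetical /tmp paths):

```shell
#!/bin/sh
mkdir -p /tmp/tardemo/src
echo "hi" > /tmp/tardemo/src/f.txt

# tar streams the archive to stdout; the receiving side just writes it to a file.
# Over ssh this would be: tar zcvf - ./ | ssh user@host "cat > backup.tar.gz"
tar -C /tmp/tardemo/src -zcf - ./ | cat > /tmp/tardemo/backup.tar.gz

tar -tzf /tmp/tardemo/backup.tar.gz   # lists the archived files
```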
- pgrep <string> - return the PIDs of any process whose name greps with "string"
- test
- Unix-like System Startup
- history | grep <search string>; !<history number>
- sed
- gawk - GNU awk
- diff
- Unix Run Processes at Startup
- Unix Network Administration
- umask
- install
- gunzip ... unzip ... uncompress
- screen
- SSH and SSH-Keygen
- BASH terminal prompts
- PS1="\u@\h \w\n$ " ... user@host pwd newline $ prompt
- PS1="\n\[\e[32m\]\u@\h \[\e[33m\]\w\[\e[0m\]\n\$" ... same idea, with green user@host and yellow cwd (\e[32m / \e[33m are ANSI color codes, \e[0m resets)
- date
- cron
- echo "hello user2 EOF" | write user2 = send a one-line message to user2 logged in on a machine.
- wall message => send a message to everybody
- echo $[ 34 * 11 ] - command line math
- patch
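A quick sketch of the command-line math item above: the `$[ ]` form is old Bash-only syntax; `$(( ))` is the portable arithmetic expansion.

```shell
#!/bin/sh
echo $(( 34 * 11 ))          # arithmetic expansion - prints 374 (same as the older $[ 34 * 11 ])
echo $(( (1 << 10) + 24 ))   # parens and bit-shifts work too - prints 1048
```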
- the following command will copy the list of files and directories which start with an uppercase letter in the current directory to destdir: /bin/ls -1d [A-Z]* | xargs -J % cp -rp % destdir
- implemented differently between gnu and bsd (the -J placeholder flag used here is BSD xargs; GNU xargs uses -I)
- find
- httrack
- ctags
- tidy -indent -wrap 0 index.html > index_new.html
- wrap 0 means don't wrap long lines
- mount
- nfs
- ps - process status
- rsync
- C-z to suspend the current job, jobs to list background jobs, fg %1, fg %2 depending on job number
- indent
- sudo su <user>
- chmod
- example: a+rwX
- u = user (file owner)
- g = group
- o = other (randos!)
- a = all users
- r = read
- w = write
- x = execute (for directories: search)
- X = execute/search only if the file is a directory or already has execute permission for some user
- s = set user or group ID on execution
- t = restricted deletion flag or sticky bit
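The capital-X rule above in a minimal runnable sketch (hypothetical /tmp paths): with a+rwX, the execute bit is added only to directories, or to files that already have some execute bit set.

```shell
#!/bin/sh
mkdir -p /tmp/chmod_demo/subdir
touch /tmp/chmod_demo/plainfile
chmod 700 /tmp/chmod_demo/subdir      # rwx------
chmod 600 /tmp/chmod_demo/plainfile   # rw-------

chmod -R a+rwX /tmp/chmod_demo

# subdir:    drwxrwxrwx  (X applied - it's a directory)
# plainfile: -rw-rw-rw-  (X skipped - regular file with no execute bit)
ls -l /tmp/chmod_demo
```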
- grep
- top
- df = report file system disk space usage... "macroscopic" view of storage devices
- du -h --max-depth=1 = How big are all the files and directories in this directory... "microscopic" view of file space usage in a specific directory.
- Passing the STDERR to the bit bucket 2>/dev/null
- 0, 1, and 2 = STDIN, STDOUT, and STDERR, respectively
- By default, if you don’t name or number one explicitly, you’re talking about STDOUT
- > is output redirection to a file
- >> Appends output to a file, rather than clearing out the file
- >& Redirect both regular and error output to a file
- proc directory
- lsof
- bash shell keyboard shortcuts
- basename and dirname
- sort -u
- tee - Like a T-shaped pipe joint.
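sort -u and tee compose naturally, as a small sketch (hypothetical /tmp path): tee writes its stdin to a file while also passing it through to stdout.

```shell
#!/bin/sh
# sort -u deduplicates while sorting; tee saves a copy and keeps the pipe flowing
printf 'b\na\nb\nc\na\n' | sort -u | tee /tmp/uniq_sorted.txt
# a
# b
# c
```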
- curl
- writes to stdout by default
- -O save the file with the same name as in the URL.
- -o new_file_name for renaming
- "-C - ": checks if part of the file has already been downloaded. If the file exists and is complete, curl recognizes this and doesn't download the file again. If the file is partially downloaded, curl will attempt to resume the download from where it left off.
- -Z download multiple files in parallel
- -k, --insecure - skip TLS certificate verification
Ghetto Parallelism
Using split
split --number=l/6 -d --additional-suffix=.txt filelist.txt filelist
function parallelize {
    set -x
    local input_file=$1   # the file.txt containing input lines to be split
    shift
    local n_splits=$1     # the number of chunks/instances of command to run
    shift
    echo "input: $input_file"
    echo "nsplits: $n_splits"
    local base=`basename $input_file .txt`
    echo $base
    local suffix="_${n_splits}_chunk.txt"
    split --numeric-suffixes --number=l/$n_splits --additional-suffix=$suffix $input_file $base   # no mid-line splits
    set +x
}
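A runnable sketch of the l/N mode used above (GNU split; hypothetical /tmp paths): l/N produces N chunks without ever splitting a line, so concatenating the chunks reproduces the input exactly.

```shell
#!/bin/sh
printf 'one\ntwo\nthree\nfour\nfive\nsix\n' > /tmp/filelist.txt

# l/3 = three line-aligned chunks; -d = numeric suffixes -> /tmp/filelist00.txt .. 02.txt
split --number=l/3 -d --additional-suffix=.txt /tmp/filelist.txt /tmp/filelist

wc -l /tmp/filelist0?.txt
cat /tmp/filelist00.txt /tmp/filelist01.txt /tmp/filelist02.txt   # same lines as the input
```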
Using xargs
- ... | xargs --verbose -i ...\;
- Default placeholder string token is {}, sometimes I will change it to '{}'
- find . -name "*.png" | xargs -l --max-procs=30 ./convert_to_tiff.sh
- that's the letter l not 1
#!/bin/bash
# convert_to_tiff.sh
ORIGNAME="$1"
DIRNAME=$(dirname "$ORIGNAME")
BASENAME=$(basename "$ORIGNAME" .png)
NEWNAME="../lanczos_tiff/$DIRNAME/$BASENAME.tif"
echo "$ORIGNAME -> $NEWNAME"
convert -verbose "$ORIGNAME" "$NEWNAME"
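The placeholder behavior described above can be sketched as follows: with GNU xargs, -I (or the older -i) substitutes each whole input line for the token wherever it appears in the command.

```shell
#!/bin/sh
# -I {} replaces the token with each input line, one command per line
printf 'alpha\nbeta\ngamma\n' | xargs -I {} echo "file: {}.png"
# file: alpha.png
# file: beta.png
# file: gamma.png
```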
System Monitoring
- mpstat -P ALL 5 | tee profile/test2 - gives per-core usage stats every 5 seconds
- %usr %nice %sys %iowait %irq %soft %steal %guest %gnice %idle
- strace, ltrace
- lsof
wget
- wget --execute="robots = off" --mirror --convert-links --no-parent --wait=5 <URL> - mirror a site locally
- -r retrieve recursively
- --no-parent, -np - Do not ever ascend to the parent directory when retrieving recursively. This is a useful option, since it guarantees that only the files below a certain hierarchy will be downloaded.
- --no-directories, -nd - Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering.
- --timestamping, -N - Turn on time-stamping
- --level=depth, -l - Set the maximum number of subdirectories that Wget will recurse into to depth.
- --mirror - equivalent to -r -N -l inf --no-remove-listing (keeps FTP directory listings.)
- --accept-regex urlregex
- --quiet, -q - used if your output needs to be free of info messages
Examples
wget -nc -np -r -A "*.xml" -l 2 https://rockyweb.usgs.gov/vdelivery/Datasets/Staged/Elevation/LPC/Project