Week 7 - Linux and The Command Line

....cybersecurity edition

Try to remember..

At a point in history, this was the ONLY
way to interact with the computer

AND

The "users" were the "programmers"
NO HAND HOLDING

Also — "The Unix Way"

The First Draft...

  1. Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
  2. Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
  3. Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
  4. Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.

Shell Scripting - Command line

That default thing that comes up on all the unixy-linuxy systems everywhere.

It’s a text interface. You type commands into it and the computer responds.
And it’s also a "programming" language. As in, you can type in more than one command in a row, save it to a file, and run the file. So, you know, "programming."
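
e.g., a "program" in this sense is nothing more than a file of commands (hello.sh is a made-up name):

# hello.sh - just two commands in a row, saved in a file
echo "Hello there"
date

bash hello.sh    # run the file, i.e. run the "program"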

Names of things

Users and Permissions..

..actually mean something today

ROOT – Like “Administrator” or maybe “God”
users – humans
(..and others – fake “users” to get tasks done)

Some systems (eg Ubuntu) allow for Super Users
S.U.- do “this” = sudo
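
For instance (assuming an Ubuntu-ish system; apt is just that family's package manager):

sudo apt update          # run one command with root's powers
sudo nano /etc/hosts     # edit a file your regular user isn't allowed to write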

CLASSIC COMEDY

Permissions

Three major things you can do with files: read (r), write (w), execute (x)

Three important “groups” the permissions apply to: the owning user (u), the group (g), and everyone else (o)

Permissions for Directories

..are weird

(“write” on a directory means: create new/delete existing files inside it, or rename them)

Practical Permission Problems
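
A sketch of the classic one (script.sh is a made-up file; chmod is the standard "change permissions" command):

ls -l script.sh       # -rw-r--r-- : owner can read/write, everyone else can only read
./script.sh           # "Permission denied" - no execute bit set
chmod u+x script.sh   # give the owning user (u) execute (x) permission
./script.sh           # now it runs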

Commands in theory

Any IMPERATIVE action the computer can do. Can be one word or more.

Ultimately, will be an ORDER, usually expressible as a VERB

Are VERY closely related (if not identical) to FUNCTIONS/METHODS

"Computer! Do THIS!"

ls

Commands in theory

Since we're in the command line we are always acting on:

FILES and/or TEXT. These will be input and/or output.

if commands are VERBS, the FILES and TEXT are the nouns/objects
On the command line, we call these arguments

(and of course, the TEXT can lead you to something else, like a FOLDER)

cat file.txt
echo "Hi there"
ls "/home/mine"

Nearly every command can act on either TEXT or FILES or BOTH.

Commands in Theory

We've talked VERBS and NOUNS. But we might want to modify the operation of things;
Think ADVERBS and/or ADJECTIVES:

On the command line, these are called options

one dash + letter (ls -a)
two dashes + words (sort --reverse)
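
They can also be stacked, e.g. (names.txt is a placeholder file):

ls -la                                    # -l (long listing) and -a (show hidden files) together
sort --reverse --ignore-case names.txt    # the long, spelled-out versions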

Getting Help

but seriously, Google/Duckduckgo etc
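
(The built-in help is still worth knowing; it works offline:)

man ls        # the manual page for ls; q to quit
ls --help     # most commands will print a usage summary with --help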

File Manipulation

Viewing Text and Files

cat - "Good" example of "efficiency" at the expense of "redundancy"
i.e. it means "concatenate" — which is to squish two files together and print to the screen. But it can also do it with just one file.

less - this is such a terribly bad joke I hate even explaining it (it pages through a file one screen at a time; the name is a pun on its predecessor, "more")

Let's slow down here,

because here is the power:

One way to describe cat - It "shows you the file"

BUT, let's be VERY precise here:
cat TAKES TEXT IN (from a file you name, or from whatever gets typed/piped at it if you don't)..
and PRINTS IT ON THE SCREEN

TEXT COMING IN (when no file is named) = "Standard Input or stdin"

PRINTS IT ON THE SCREEN = "Standard Output or stdout"

Pipes and Redirects

Default is to read from stdin, and write to stdout.
But by changing the default NOW YOU'RE PLAYING WITH POWER

(interesting then: cat goes from FILE to TEXT, and > goes from TEXT to FILE)
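
A hedged sketch of both (access.log and sorted.txt are made-up names):

cat access.log | sort           # | : stdout of one command becomes stdin of the next
sort access.log > sorted.txt    # > : stdout goes into a file instead of the screen
sort access.log >> sorted.txt   # >> : same, but append instead of overwrite
sort < access.log               # < : a file becomes stdin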

BIG OVERARCHING POINT..

THAT'S MY OPINION

If it works and it's clearer to you, don't let the supernerds tell you it's a bad idea, e.g.
"Useless use of cat" IS FINE

BASH

BASH (Bourne Again) Shell - others are fish and zsh, etc

Lots of “tricks” are available here, eg
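
(a few standard ones, just as illustration:)

history      # everything you've typed recently
sudo !!      # !! expands to "the previous command", here re-run with sudo
# plus: TAB completes names, Ctrl+R searches your command history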

and many MANY more

BASH

Furthermore, you can modify this environment to fit your needs, via:
.bashrc
(stuff here will be run every time you open a terminal)

A great example is the “alias” command. If a command doesn't exist for what you want to do, just make up your own!

alias modbash='nano ~/.bashrc'
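
A minimal sketch of what a .bashrc addition might look like (these particular aliases are just examples):

# in ~/.bashrc
alias ll='ls -la'                 # a very common convenience alias
alias modbash='nano ~/.bashrc'    # the one from above

source ~/.bashrc    # reload it without closing the terminal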

Viewing Files

IN TERMINAL

ALSO

Opening Files

COMMAND/ARGUMENT STYLE

Cybersecurity and Forensics

Thinking about these tools re: cybersecurity, you're likely not looking at prose or code, but "data," often tabular or otherwise "organized."

Keeping it simple at first:

SORT

sorting text
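
A few hedged examples (names.txt and sizes.txt are placeholders):

sort names.txt             # alphabetical, line by line
sort -r names.txt          # reversed
sort -n sizes.txt          # numeric sort, so 9 comes before 10
sort names.txt | uniq -c   # classic combo: count duplicate lines (uniq wants sorted input)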

GREP

searching text for matches

grep OPTIONS PATTERN (FILE)
Can search over FILES or STDIN
Also, can search ONE FILE or MANY (check -d or -R)
useful flags:
-i (case insensitive)
-v (invert search/show NON-matches)
-l (just show matching FILES, not lines)

(see also "ripgrep" or rg)
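
Sketches (auth.log is a made-up filename; the patterns are arbitrary):

grep "Failed password" auth.log      # lines that match
grep -i "failed password" auth.log   # same, case insensitive
grep -v "session opened" auth.log    # every line that does NOT match
grep -Rl "password" /var/log         # which files under /var/log match (names only)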

FIND

Searches the directory tree rooted at the given path (default: the current directory)
Good if you also want to filter on things like “date modified”, “last accessed”, “size” and so forth.
Often used with -name or -iname
Also, consider “locate” (its database must be set up beforehand)
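
Hedged sketches (the paths and patterns are just examples):

find . -name "*.log"            # every .log file under the current directory
find /home -iname "*resume*"    # case-insensitive name match under /home
find . -size +10M -mtime -7     # bigger than 10MB AND modified in the last 7 days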

This was just "searching"

but what if we want to change the text?

Remember, this is relatively easy and non-destructive by default; most of the time we're NOT changing the file in place, we're printing to stdout and optionally saving that output:

First, the granddaddies:

SED and AWK

You can do A LOT with these; they're basically languages in their own right. They're a little difficult, especially AWK.

SED

REGULAR EXPRESSIONS

echo "Good day" | sed 's/day/night/'
http://www.grymoire.com/Unix/Sed.html
http://sed.sourceforge.net/sed1line.txt
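
A couple more hedged examples (notes.txt is a placeholder):

sed 's/day/night/g' notes.txt   # g = replace EVERY match on a line, not just the first
sed -n '1,10p' notes.txt        # print only lines 1-10 (-n suppresses the default printing)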

AWK

awk '<search pattern> { <program actions> }'
Also a text-processor, good for flat-file databases
Also, an entire language

awk '/apples/ { print $2 " " $1 }'
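
(Given a line like "apples 12", that prints "12 apples" - it matches, then swaps the two fields.)
Another hedged sketch, pulling columns out of a real file:

awk -F: '{ print $1 }' /etc/passwd              # -F: sets the field delimiter; prints just the usernames
awk -F: '$3 >= 1000 { print $1 }' /etc/passwd   # the pattern can be a condition (UID >= 1000 is "human" on many distros)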

but, some of my go to stuff

tr

Transliterate, i.e.
CHANGE a character to another
(yes, this is how I did JOHN IS RAD)
tr '[a-z]' '[A-Z]'
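
e.g. (the quotes keep the shell from touching the brackets):

echo "john is rad" | tr '[a-z]' '[A-Z]'    # -> JOHN IS RAD
tr -d '\r' < dosfile.txt > unixfile.txt    # -d deletes characters; classic Windows-line-ending cleanup (filenames made up)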

cut

cut pieces out of each line, e.g. by field

(this is my favorite. I just find it way more intuitive than awk/sed)

-f = which field or fields and optionally
-d = change the delimiter

e.g. to get the last name:

cut -f2 -d " "
-> ...combine with the following
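
Hedged sketch (the name and /etc/passwd fields are just illustrations):

echo "Ada Lovelace" | cut -f2 -d " "   # -> Lovelace
cut -f1 -d: /etc/passwd                # first field of a colon-delimited file = the usernames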

wc

Is for "word count" — but it can do newline and byte counts.

Since bash can do a lot of "by line" stuff, wc -l
might be valuable

(again, a lot of these tools have "count" built in, but I find this easy to remember)
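
e.g. the classic "how many of X" pipeline (auth.log and the pattern are placeholders):

wc -l auth.log                            # how many lines in the file
grep "Failed password" auth.log | wc -l   # how many matching lines (grep -c does this too)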

FILE

hey, it's a command. It tells you what TYPE of file you're looking at (text, image, executable, archive, ...)

Process file line by line:
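
A minimal sketch of the usual bash idiom (input.txt is a placeholder):

while read -r line; do
    echo "got: $line"    # do whatever you like with $line here
done < input.txt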

Cybersec specific?

exiftool - (grab metadata from pictures)
strings - (Look for human readable strings in anything)
zipping and unzipping generally (try it on a docx or odt ☺ )
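
Hedged sketches (photo.jpg, mystery.bin and report.docx are placeholders):

exiftool photo.jpg           # camera model, timestamps, maybe GPS coordinates...
strings mystery.bin | less   # any human-readable text hiding inside a binary
unzip -l report.docx         # a .docx really is a zip: list what's inside it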