LIS-3353 - Computers and Thinking

Computers:
From numbers
to thinking

Simplest computer I can think of...

BINARY system. TWO possible choices


Binary System


0 and 1

..to the BINARY slides..
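The two-symbol idea in one runnable sketch (Python's built-in `bin()` and `int()` do the converting):

```python
# Counting with only 0 and 1: ordinary numbers, written in binary.
for n in range(6):
    print(n, "->", bin(n))  # e.g. 5 -> 0b101

# And back again: int() can parse a binary string.
print(int("101", 2))  # 5
```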

HUGE CONCEPT #1

All computers do is “numbers”
- you put numbers into them
- it messes with the numbers
- it gives you some numbers back

Charles Babbage


“The whole of arithmetic now appeared within the grasp of mechanism.”


The Difference Engine

..which is this





HUGE CONCEPT #1

All computers do is “numbers”

AKA COMPUTERS ARE ESSENTIALLY JUST MATH MACHINES...



HUGE CONCEPT #1

All computers do is “numbers”

(but, you can store anything in numbers)

The first “program?”

Ada Lovelace




From “Difference Engine” to “Analytical Engine”

which
'might act upon other things besides number... the
Engine might compose elaborate and scientific
pieces of music of any degree of complexity or
extent'.


What about “words?”

Let's say we want to say, “Hi.”

ASCII

(technically not what we use today, but hey...)

H - 01001000
i - 01101001
01001000, 01101001

(or really, just “72, 105”. More on that later...)
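The “Hi” example, done by the machine itself. `ord()` maps a character to its code number and `chr()` maps it back:

```python
# Turning "Hi" into numbers (and back) with ASCII codes.
message = "Hi"

codes = [ord(ch) for ch in message]       # characters -> numbers
print(codes)                              # [72, 105]

bits = [format(c, "08b") for c in codes]  # numbers -> 8-bit binary
print(bits)                               # ['01001000', '01101001']

print("".join(chr(c) for c in codes))     # numbers -> back to "Hi"
```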

Today we use, mostly


Unicode.
Yay emojis?
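Unicode is the same trick with a much bigger table of numbers; even an emoji is just a code point:

```python
# ASCII characters keep their old numbers in Unicode.
print(ord("A"))        # 65

# An emoji is just a bigger number: code point U+1F600.
print(ord("😀"))       # 128512
print(hex(ord("😀")))  # 0x1f600
print(chr(128512))     # 😀
```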

Images?

Sound/Music?





email?

ALL NUMBERS.
ALWAYS “CONVERTIBLE”

HUGE CONCEPT #1

All computers do is “numbers”

(but, you can store anything in numbers)

HUGE CONCEPT #1

ANYTHING IN NUMBERS
yields
HUGE(R) CONCEPT #2

HUGE(R) CONCEPT #2

All computers do is follow a very
precise list of instructions that one or
more people wrote.

Understanding Power



Computers

The smartest and dumbest things in the world.

Teaching the robots to escape

1) If there's a door in arm's reach, exit – you're done, else
2) If you can, take one step forward then goto 1), else
3) Rotate to the left until there's not a wall in front of you
then goto 1)

(this will get you out of any “regular” empty room)
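The three rules, followed completely literally in a tiny simulated room (the room layout, start position, and door location below are made up for the demo):

```python
# A literal sketch of the three escape rules in a simulated room.
ROOM = [
    "#####",
    "#...#",
    "#...#",
    "#...D",   # D = the door, set in the east wall
    "#####",
]

DIRS = [(-1, 0), (0, -1), (1, 0), (0, 1)]  # N, W, S, E (each +1 is a left turn)

def escape(row, col, facing, max_steps=100):
    for step in range(max_steps):
        # Rule 1: if a door is in arm's reach (a neighboring cell), exit.
        for dr, dc in DIRS:
            if ROOM[row + dr][col + dc] == "D":
                return step  # escaped!
        # Rule 2: if you can, take one step forward, then back to rule 1.
        dr, dc = DIRS[facing]
        if ROOM[row + dr][col + dc] != "#":
            row, col = row + dr, col + dc
            continue
        # Rule 3: rotate left until there's no wall in front, then rule 1.
        while ROOM[row + DIRS[facing][0]][col + DIRS[facing][1]] == "#":
            facing = (facing + 1) % 4
    return None  # never escaped (shouldn't happen in a "regular" room)

print(escape(row=1, col=1, facing=0))  # steps taken before reaching the door
```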

An almost random bit on recursion


In computers, it's actually okay to define something with itself.
PSEUDOCODE!
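The classic example of defining something with itself is the factorial function:

```python
# Recursion: factorial defined in terms of itself.
def factorial(n):
    if n == 0:
        return 1                     # base case: the self-reference stops here
    return n * factorial(n - 1)      # the function calls itself on a smaller input

print(factorial(5))  # 120
```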



Here we go

Go to the store; if they have 2% lactose free
chocolate milk, then get me a carton.


CODE, again


That was, computers are dumb.


On to: Computers can fake being very smart

The Magic Genie

Recursion, trees, and AI.

AKA

AI is (not) extremely impressive


Let's go..


Making a computer a genius in 4 steps

- ask yes/no questions down the tree until you reach a guess
- if the guess was wrong, ask for a new question that tells the right answer apart
- add that question to the tree where the guess was (optionally, try to be general or “half-y”?)
- repeat until genius
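A tiny version of the genie: a binary tree of yes/no questions with guesses at the leaves (the particular questions and characters below are made up):

```python
# A hand-built guessing tree: (question, yes-branch, no-branch),
# with plain strings as the guesses at the leaves.
tree = ("Is your person a DC character?",
        ("Do they wear a cape?", "Batman", "The Joker"),   # yes-branch
        ("Do they have a hammer?", "Thor", "Spider-Man"))  # no-branch

def play(node, answer):
    """Walk the tree, using a function that answers each question yes/no."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        node = yes_branch if answer(question) else no_branch
    return node  # a leaf: the final guess

# Pretend the player answers "no" to everything.
print(play(tree, lambda q: False))  # Spider-Man
```

The real game adds step 3 from the slide: when the guess is wrong, it grafts a new (question, right answer, old guess) node into the tree, so it gets smarter every round.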

The Magic Genie

(can be used for evil too...)


What about instead of

“Is your person a DC character?”
you ask real questions about real people?
(more on this later, but this demonstrates why
surveillance is easy and anonymity is hard.)
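A quick back-of-the-envelope for why that works: each yes/no answer can cut the candidate pool in half, so the number of questions needed only grows with the logarithm of the population (the population figure is a rough illustration):

```python
import math

# Each yes/no question can halve the candidate pool,
# so log2(population) questions are enough in principle.
population = 8_000_000_000  # rough world population, for illustration
questions_needed = math.ceil(math.log2(population))
print(questions_needed)  # 33
```

About 33 well-chosen yes/no facts are enough, in principle, to single out one person on Earth.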

So then...


AI?

Alan Turing

Alan Turing

Not Alan Turing but I'll probably check out
the movie too...

How to sound smart..


“Lots of very simple instructions...

can add up to complex computations.”
“Turing Machine (Turing Completeness)”
(an infinite tape w/ simple instructions)

“Lots of very simple instructions ...

can add up to complex computations.”
“Lambda Calculus”
(mathy way to express the above; this is
literally all you have to know)

What this really means:

Choice of "computer language" is not that important;
(IN THE ABSTRACT)
Most languages (if not specialized)
can do anything any language can do
(that's the point of computers)

What this really REALLY means:


ALL of this is VERY fluid.
That's the point of a General Purpose Machine
To hack, and to find different uses, and
to do things in different ways.

A.I. ARTIFICIAL INTELLIGENCE!

- up for debate, but history tells a lot; I'd
suggest people “move the goalposts” a lot.
“Tests”
- games like Chess
or...

The Turing Test

Simplest expression:
Could a computer (typing/chatting online)
fool a human into thinking it was a human?


