Friday, September 30, 2016



Some questions posted by the class after Lecture 1, September 27, 2016

•    I know that every letter or number can be expressed using 0s and 1s, but how do they not get mixed up when combined together? For example, one letter is expressed as 001 and another as 110; together they form 001110. How can you tell whether it is 001/110 or 0011/10? (See the sketch after this list.)
•    Why did it take so long to create the first computer, given that computing originated in the 17th century?
•    How similar is the nature of the human mind to that of a program?
•    How are humans and other biological computers programmed in ways similar to a typical electronic computer system?
•    Why must a computer read data only in terms of two symbols? Why must it use a simple formal language rather than a natural one?
•    To what extent are computing and the computing thought process used in further research on artificial intelligence?
•    How does defining and computing things using only two digits work, and how was it discovered?
•    Can biological computers be thought of as transformers as well, due to their ability to adapt and evolve to the needs demanded by their environment?
•    Can you make a computer more "programmable" even after it is constructed? For example, by adding memory to download more software, applications, or features?
•    Can the programmability of different computing devices differ? And if so, what determines a computer’s programmability? Is it the hardware, software, programmer, user, or none of the above?
•    How often do computations make better choices than humans? Or is the human element beneficial in some computations?
•    How can we compare the way a human processes information to the way a computer does?
•    How does biological computing work?
•    Where does a machine end, and a computer begin?
•    With the invention of artificial intelligence, will computers still have limitations since they have the ability to learn?
•    Computing has existed for a long time, but how has it changed throughout history?
•    You said that integers are a constant regardless of human relativity, but I wonder how they could possibly be a constant, or anything at all, without relativity.
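
A note on the first question above: practical encodings avoid this ambiguity in one of two ways. Either every symbol gets the same fixed number of bits (ASCII uses 8, so a decoder simply cuts the stream every 8 bits), or the code is prefix-free, meaning no codeword is the beginning of another, so a left-to-right scan always knows where one symbol ends. Below is a minimal Python sketch of both ideas; the variable-width codewords in the second half are invented purely for illustration.

    # Fixed-width decoding: every symbol occupies exactly `width` bits
    # (ASCII uses 8), so symbol boundaries are never ambiguous.
    def decode_fixed(bits, width=8):
        chunks = [bits[i:i + width] for i in range(0, len(bits), width)]
        return "".join(chr(int(c, 2)) for c in chunks)

    bits = format(ord("H"), "08b") + format(ord("i"), "08b")
    print(decode_fixed(bits))  # -> Hi

    # Prefix-free decoding: no codeword is a prefix of another, so reading
    # left to right, the first complete match is always the right one.
    # These particular codewords are made up for the example.
    CODE = {"0": "a", "10": "b", "110": "c", "111": "d"}

    def decode_prefix(bits, code):
        out, buffer = [], ""
        for bit in bits:
            buffer += bit
            if buffer in code:       # a complete codeword has been read
                out.append(code[buffer])
                buffer = ""
        return "".join(out)

    print(decode_prefix("0101100", CODE))  # 0|10|110|0 -> abca

The 001 vs. 110 example in the question falls under the first case: if every letter is exactly three bits wide, 001110 can only be read as 001|110.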

Saturday, September 3, 2016

Greetings!
This will be the blog for the Freshman Seminar "What is Computing" aka int94th at UCSB, Fall 2016.

But isn't computing about crunching numbers? Big numbers? Lots of numbers? Is this not really doing arithmetic fast?  Really fast? Or is it cellphones? Airline reservations? Guiding rockets? Financial transactions? Video games? How about "...listen carefully for our menu has recently changed..."? Writing long "codes" perhaps? Weather prediction? Solving the Rubik's cube? Digital photography? The Internet? Or are all of these engineering aspects of a very central human endeavor? The answer is THEY ARE ... but how can we define Computing and make the definition precise, palatable and useful at the same time?

We will talk about what Computing is, and argue that it is a basic constant of the world-universe-cosmos, like the chemical elements and the integers (OK, even this is arguable!). It is a way of thinking, and a way of "measuring" difficulty of the decisions we make, the way we organize objects and quantify the behavior of things around us. But there is more...

The students will be required to think about each lecture/discussion (there are 10 of these), summarize it in a paragraph, come up with a novel question about the topic, and send these to me.

-OE