Think it's hard? Think again. It's really not so bad...
Well, perhaps some aspects of it can be terrible, like sitting in front of a computer all day while the sun is shining and the waves are thrashing...
Basically, all it takes to be a good programmer is patience, attention to detail, and an ability to imagine and think through logical scenarios. Patience, because it can take a long time, and quite a lot of frustration, to make a computer do as you wish. Attention to detail, because computers are super detail-oriented, and you need to be as well in order to instruct them properly. And logic, because the computer is strictly logical. But if you don't have much practice with logic, don't worry about it: you'll get plenty of practice learning how to program!
To be a great programmer one needs all of the above, but also an ability to socialize and understand other people. This is important for two reasons: most software is developed in teams, and most people who buy software aren't very good at describing exactly what it is that they want. Sounds crazy, but it's true! A great programmer needs to be able to understand what the customer truly wants, probably better than the customer can articulate it herself!
Here I'm going to teach Python. Why Python? Well, for starters, I think it's one of the easiest languages to learn. It's also very elegant, which is important when programming (more on that later). But mostly because I like it, and I think most people will too. Besides, once you learn one language, learning other languages gets a lot easier... so who cares?!
But before we get into all the details... I want to give you a quick overview of what programming is all about. In this post, you won't learn how to actually program. Instead, I want to give you an overview of what it even means to program, and what it means to have a computer actually run your program...
Information and Actions.
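Take a look at this single line (I'm using Python's standard `print` function here):

```python
print("howdy do!")
```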
The above is actual Python code. Can you guess what it does?
Yup! This causes "howdy do!" to be displayed on the screen. Here, the piece of information involved is the text "howdy do!", while the action is to print it to the screen.
But more generally, information is basically what it sounds like. It can be numbers, words, pictures, songs, colors, types of cars, brand names, people's ages, and whatnot.
Actions are things that computers can do, almost always involving information: checking information, reading information, writing information... Things that don't involve information usually aren't important to computer programmers. An electrical engineer may care about how to physically make electricity turn into light on the screen, but a computer programmer just takes for granted that such a magical device exists, and gives it the information it needs to display something on it.
Hard to believe, but that's pretty much all computer programming is all about... well, sort of...
A world of limitations
As you probably have guessed, computers are actual devices that exist in physical space-time. As such, they're not all-powerful and have earthly limitations. To better understand these limitations, we'll have to go into strange geeky-sounding topics such as "memory" and "CPU" and "bits" and "bytes". I'll try to go easy here, but it's kind of like taking off a bandage: you do it, it hurts, and then you end up regretting having done so and immediately put on another one. So let's get this thing over with already:
Information is generally limited by memory. What does this mean? Well, the smallest piece of information that a computer has is called a bit. What a bit is in physical terms is something you'll have to ask an electrical engineer about, if you're curious. But to us computer programmers it's basically "one of two values".
"One of what?" you ask. It's symbolic, meaning it can be whatever you want it to be. It can be either "1" or "0". Or "true" or "false". Or "green" or "blue". Or "left" or "right". Or "right" or "wrong". Or... You get it. Typically it's regarded as either "true" or "false", or "1" or "0", but that's just typically.
Assignment 1.1: think of a few things that a bit can symbolize.
So how useful is "one of two values"? Well, it's useful, but not crazy useful. But fear not! If, for instance, you want "one of four values", you can take two bits and have 2 * 2 = 4 possible values! How AWESOME is that?! What if you want "one of eight values?" Well, now you need three bits and you have 2 * 2 * 2 = 8 possible values!
Crazy stuff, I know. Typically, bits are grouped together in standard group sizes to be useful. For instance, there is a thing called a byte, which is eight bits strung together, which gives you 256 possible values! WOW! That can be a number from, say, 0 to 255, which could probably represent my life expectancy, or the number of cars that I own, or the number of kids that I have, or... But it can also be one of the 7 colors in the rainbow, or one of the fifty states of America, or...
Assignment 1.2: think of a few things that a byte can symbolize.

Would you believe me that sometimes two bytes, or 16 bits, are grouped together, and are called something... That makes, if my calculator is right, one of 65,536 values! And of course, there are four bytes, or 32 bits, which make one of 4,294,967,296 values! Well, you get the hint: a single bit, although a little wimpy on its own, can be grouped together with other bits, and a lot of information can be represented.
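All this doubling is just powers of two: string n bits together and you get 2 multiplied by itself n times, or 2**n, possible values. Here's a tiny Python sketch (you'll learn to run code like this soon enough):

```python
# Each extra bit doubles the number of possible values: n bits -> 2**n values.
for bits in (1, 2, 3, 8, 16, 32):
    print(bits, "bits ->", 2 ** bits, "possible values")
```

Try plugging in other numbers of bits yourself and see how fast the values grow.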
Assignment 1.3: how many values can be represented by 64 bits?

Wow, what a tangent! I didn't even get to talk about the limitation problem here. So let's talk about it now. Computers are limited to a set number of bits, or bytes, or whatever. This limits us in two main ways:
In the "big picture" we are limited by the amount of total memory, or space, that we have. For instance, if you have 4 gigs (4 billion bytes) of memory in your computer, that's about all the bytes that your computer can deal with... It may have access to more information, such as information on your disk drive, or on the internet, or even to the information in your brain (if you type it in on the keyboard), but at any given moment, the computer will only be able to deal with 4 billion bytes of information. That's a lot, especially for most stuff that you will want to do, like play Tetris or something, but it's not infinite, and it's not enough for some things.
Also, in the "small picture" we are limited by memory on a smaller scale. For instance, say you choose to represent the money you have (in a whole dollar amount) in your wallet with eight bits, or a byte. And say, this byte will represent values between $0 and $255 (that's a total of 256 possible values.) Now say you have $100 in the wallet now, and to it, you add another $100. You end up with $200, right? And since that's within your "$0 to $255" range, all is OK. But what happens if you add another $100? You should end up with $300, but guess what?! $300 is not in your $0 to $255 range!!
So what would actually happen? The computer will explode, the universe will implode, and life as we know it will end. Seriously. OR, more likely, you'll end up with some other value that's within the correct range but isn't conceptually correct, like $44. Yup, if you ask the computer to add $100 to $200, and you have limited it to the $0 to $255 range, the computer may tell you the answer is $44: the value simply wraps around past $255 and starts over from $0. Or you may get something else, depending on the computer. Pretty dumb, right? Well, computers are dumb. But more on that later.
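Python's own numbers don't actually wrap around like this (more on that in a moment), but we can simulate a single byte to see the effect. Here's a sketch of the wallet scenario, keeping only the lowest 8 bits of each result, the way a real one-byte counter would:

```python
def add_as_byte(a, b):
    # Keep only the lowest 8 bits of the sum, like a one-byte counter.
    # The % 256 is what makes the value "wrap around" past 255.
    return (a + b) % 256

print(add_as_byte(100, 100))  # 200 -- still fits in the 0..255 range
print(add_as_byte(200, 100))  # 44 -- 300 wrapped around past 255!
```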
For now, it's only important to realize that computers are limited in the way they can deal with information. BTW, one of the advantages of Python is that it sort of "takes care" of a lot of the "small picture" limitation problems for you automatically. It sort of deals with the annoying details, so you don't have to. That's nice.
BUT WAIT! There's ANOTHER important limitation, and that's time. Computers may be fast, but they're not infinitely fast. Telling a computer to add two numbers, or to perform any other action will take it some time to actually perform. How fast depends on the computer itself. Generally the faster the "CPU" you have, the less time it will take a computer to perform a particular action.
And of course, some things take longer than others. Adding two numbers may take a fraction of a millisecond, but analyzing all possible routes that you can take to get home from work may take it more time than the sun has left to burn.
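To get a feel for why something like route-finding blows up, imagine visiting n places in some order: there are n * (n-1) * ... * 1 = n! (that's "n factorial") possible orderings. A tiny sketch of how fast that grows:

```python
import math

# The number of possible orderings of n stops grows as n! (factorial).
for n in (5, 10, 20):
    print(n, "stops ->", math.factorial(n), "possible orderings")
```

Even at a billion orderings checked per second, just 20 stops would take roughly 77 years to check exhaustively.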
So how do navigation programs work? Well, they have smart programmers who, instead of asking the computer to analyze all possible routes, have programmed a few "shortcuts" or "tricks" or "heuristics" or "artificial intelligence" or whatever you want to call it (they're basically all the same thing), and have the computer analyze only some of the possible routes while completely ignoring others. This is not a perfect solution, but it usually works fairly well, and you get home alright in the end.
I don't want to get into specific "shortcuts" or "heuristics" at this point, but it's just important for you to realize that computers are limited not only in space but also in time.
So that's it! As a computer programmer, you need to know that computers deal with information, perform actions on information, but do so in a limited (yet powerful) capacity.
Computers are Dumb. Really Dumb.
So dumb that you can't even use the word "dumb" to describe how dumb they are. They are mindless robots following your instructions perfectly, and to the letter. Yes, you're used to Google's "auto-complete" feature which "reads your mind," or your amazing spell checker, or whatever the heck you use your computer for that makes you believe that computers are awesome and smart. Well, they may be awesome, but they're definitely not smart.
And here, and now, I will give you the most important lesson in computer programming:
And I quote myself here when I write: computers are really dumb, and if they do something "wrong," it's YOUR, the programmer's, fault.

A computer never does anything wrong. It does everything that it's programmed to do, and does it exactly right. It's you, you you you you you you you, who programmed it wrong.
It takes most beginner programmers about a year to internalize this. They may yell at the computer, they may even smash the computer. They may cry all day and cry all night. But nothing can ever help them until they realize that it is they who are the source of their misery.
Assignment 1.4: blame yourself for something you did wrong today.
Assignment 1.5: look at the computer you're using and tell it: "I know you can't hear me, but you're dumb, you're really really dumb."
To better help you apply concepts that you learn, I'm giving you a little "side-project" to actually write the game Minesweeper from scratch! Of course, it's going to take all the lessons in this tutorial to complete a beautiful, graphic version of the game. But at the end of every lesson, I'll give you a guided assignment in order to move you along the path to a completed game! So let's start now! :)
Before any programming assignment, one needs to be familiar with the task at hand. In our case, that means you gotta go play a little Minesweeper to get the hang of the game! So, if you're not at all familiar with it (seriously?), Minesweeper is a little game that challenges you to find all the mines in a minefield. But beware: if you click on an area with a mine, you lose and need to start over! To help you out, when you click on a safe area of the map, it tells you how many mines are in the adjacent cells (the up-to-eight squares surrounding it, including the diagonals). You should play with the game a little, just to get familiar with it:
- You can play with it online here: minesweeperonline.com
- And of course, you can read about it here: wikipedia.org.
Assignment 1.6: go play a little Minesweeper. :)
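Just as a preview of where the side-project is headed (don't worry about the syntax yet, we'll get there), here's a sketch of that counting rule in Python: given a cell, look at the up-to-eight squares around it and count the mines. The little grid and the names here are made up purely for illustration:

```python
# A tiny made-up 3x3 minefield: True means "mine here".
field = [
    [True,  False, False],
    [False, False, True ],
    [False, False, False],
]

def adjacent_mines(field, row, col):
    # Count mines in the up-to-eight neighboring cells, diagonals included.
    count = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # skip the cell itself
            r, c = row + dr, col + dc
            if 0 <= r < len(field) and 0 <= c < len(field[0]) and field[r][c]:
                count += 1
    return count

print(adjacent_mines(field, 1, 1))  # the center cell touches 2 mines
```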
Next time I'll go into other things, and perhaps you'll even get a taste of actual programming. Yay!
Continue with the next lesson: Playing with Information
So here is a riddle for you...
2, 47, 3, 20, 34, 53, 23, 20, 59, 12, 23, 20, 99, 23, 73, 59
2 --> y
3 --> u
12 --> h
20 --> [space]
23 --> e
34 --> a
47 --> o
53 --> r
59 --> t
73 --> s
Can you figure out what 99 is? :)