Recently, I reread a few articles about how terrible PHP is. I read them early on in the summer, when I started my first real programming job, working in - you guessed it - PHP! A friend of mine (who's currently working on his PhD in CS, go Ed!) warned me that my coding skill would decrease as I worked with PHP. I was a bit worried. I wanted to be a good programmer*.
After four and a half months of intense learning, coding, and reading, I came back. Were the articles right? Did my coding skill die? I think I'm getting ahead of myself.
Reading the comments (always a delight) led me to two articles that disagreed on different fronts. The first agrees: PHP is annoyingly bad, but we use it anyway because it's so easy to use. The other article takes a different approach: rather than focusing on how irritatingly bad PHP is, it looks at what we web developers really need. According to the author, PHP's the only language/framework that's got it where it counts.
All of the articles above are good reads, written by smart people, but I can't help but feel that all of them are missing the point. I do my work in PHP and get the job done. Awesome. Here's the problem: for the past few weeks, I've been working on something that shouldn't exist. Our next project deals with an API. This API is so horribly bad that I have to write another layer around it so that our devs don't go insane dealing with Codethulu. An API for an API, if you will.
This sort of thing shouldn't happen. As my professor says, "You can be profane in any language." We should worry less about strange things happening with equality operators and more about creating good code. Not just creating good code by ourselves, but setting an example for our teams, talking to our bosses about why outsourcing is (usually) bad, and contributing to open source projects so that noobs like me can learn what good code is.
*whatever that is.
Friday, September 28, 2012
Wednesday, September 26, 2012
Numbers are weird
I'm working my way through Steve McConnell's excellent Code Complete 2. I'm currently on chapter 12, which covers the basic data types a programmer might encounter. Most of the chapter covers numbers and the errors one might encounter when using them. I naively thought "Aha! I can do numbers!" Ah, if only I had paid attention in Computer Architecture, I would have known computers have a much harder time dealing with numbers than we do.
The main reason computers have an issue with numbers is that they only have a limited space to put the numbers in. Computers can only use zeros and ones (or "bits") to represent numbers. "Normal" numbers usually only have 32 binary digits to describe them. This gives us a range (in base 10) of -2,147,483,648 to 2,147,483,647 if we use the first bit to determine whether the number is positive or negative, and 0 to 4,294,967,295 if all 32 bits are dedicated to the magnitude.
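Those ranges fall straight out of the bit math. Here's a quick sketch (in Python, purely for illustration) that computes them from the bit width:

```python
BITS = 32

# Signed (two's complement): one bit is spent on the sign,
# and the negative range gets one extra value.
signed_min = -(2 ** (BITS - 1))    # -2,147,483,648
signed_max = 2 ** (BITS - 1) - 1   #  2,147,483,647

# Unsigned: all 32 bits hold the magnitude.
unsigned_max = 2 ** BITS - 1       #  4,294,967,295

print(signed_min, signed_max, unsigned_max)
```

Change `BITS` to 64 and you get the 18-quintillion-orange ceiling mentioned below.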
So what's the problem? Those numbers are more than enough to write a shopping list, unless you're buying 5 billion oranges. Even if your shopping habits include ~1/3 of the annual Florida orange production, it's a simple matter to add more bits. Using 64 bits (the next typical setting), you could purchase 18,446,744,073,709,551,615 oranges before having issues. (The volume of said oranges is still 1,000,000x smaller than the Earth's volume).
The issue arises when we try to stuff numbers into places too small for them. Say I wanted 127 oranges. (I'm a college student, scurvy is a very real issue.) Since a signed 8-bit integer can store a number up to 127, I store that in my brain as orangesToBuy (look at all the bits I'm saving!). As I drive to the orange store, my roommate calls me and asks me to pick up an orange for him. I add one to orangesToBuy. When I finally arrive, I walk to the counter and order -128 oranges. What happened?
When I added the fateful orange, the bits flipped from 01111111 (positive, the maximum value) to 10000000 (negative, the minimum value). xkcd illustrates this wonderfully.
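You can simulate that wraparound in any language with big integers by masking down to 8 bits. A minimal sketch (Python; the helper name is my own invention):

```python
def to_int8(n):
    """Interpret n as a signed 8-bit (two's complement) integer."""
    n &= 0xFF                        # keep only the low 8 bits
    return n - 256 if n >= 0x80 else n

oranges_to_buy = 127                 # 01111111, the signed 8-bit maximum
oranges_to_buy = to_int8(oranges_to_buy + 1)
print(oranges_to_buy)                # -128: 10000000, the signed 8-bit minimum
```

One orange past the maximum and the sign bit flips, which is exactly the trip to the orange counter described above.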
When storing numbers, make sure to use a type appropriate for the situation. It's always better to err on the side of a too-big type than to spend hours debugging an overflow later on down the line.
Thursday, September 20, 2012
About Me
I'm Randall Koutnik (It's pronounced "Coat-nick," thankyouverymuch). Near the end of Fall semester 2011, I StumbleUpon'd How to Make Wealth. The idea behind this essay blew me away. Up until then, I was a slacker college student with some vague idea of running my own company someday. As I finished that essay, my life goals began to define themselves.
10 months later, I'm someone different. I've vastly improved as a programmer. I now know much more about many different languages. I'm not stuck in a dead end if Java isn't the best solution (and I've learned that Java usually isn't the best solution). I no longer procrastinate as much, and when I do, it's on things that are useful rather than cat pictures.
I'm still very aware of how far I have yet to go. I'm now employed as an assistant manager at a small-but-awesome software firm. I don't have my own company (yet!). I've learned that doing what seems impossible only takes hard work and dedication.
What is Recoding?
Recoding is simply a clever name for a blog.
Recoding is going back and fixing that thing you should have done right in the first place.
Recoding is attacking code you wrote six months ago, rewriting it with all the knowledge you've gained.
Recoding is gaining that very knowledge.
Recoding is being handed thousands of lines of terrible code and gleefully diving in, glad for the chance to create a functioning system out of spaghetti.
Recoding is learning new things that you'll never use, just for the experience of doing something you've never done before.
Recoding is re-engineering your entire life, changing habits, friends, and beliefs to achieve what you once thought was only a pipe dream.
Recoding is the story of how Randall Koutnik changed his life from a good-enough slacker to something better.
It's a story about me, but it can become your story too. Read on to start your own recoding adventure.