Enumerations for the Reduction of Complexity
Enumerations for the Reduction of Complexity March 13, 2020 by thepigeonfighter Introduction One of the ever-present problems faced by developers is complexity. Complexity is everywhere; it makes things difficult to analyze and even more difficult to control. When writing programs, the number of possible paths grows exponentially as conditionals are added: every ‘IF’ statement introduces two possible paths your program could take. This article will start with the principles behind conditionals and then examine possible use cases for enumerations. ...
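To make the idea concrete, here is a minimal Java sketch of an enum folding a chain of conditionals into data; the PaymentMethod enum and its fee rules are invented for illustration and are not taken from the post.

```java
// Branch-heavy version: every new payment method adds another path to test.
class FeeWithIfs {
    static double fee(String method, double amount) {
        if (method.equals("CARD")) return amount * 0.03;
        else if (method.equals("BANK_TRANSFER")) return 0.50;
        else return 0.0;
    }
}

// Enum version: each constant carries its own rule, so callers have a single path.
// PaymentMethod.CARD.fee(100.0) returns 3.0.
enum PaymentMethod {
    CARD          { double fee(double amount) { return amount * 0.03; } },
    BANK_TRANSFER { double fee(double amount) { return 0.50; } },
    CASH          { double fee(double amount) { return 0.0; } };

    abstract double fee(double amount);
}
```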
How Code Becomes Binary
How Code Becomes Binary November 10, 2019 by thepigeonfighter Humans have been inventing programming languages since the 1940s. But what exactly is a “programming language,” and how do computers understand different programming languages? Well, read on to find out. The Basics Before we dive into things we have to break down what code is. Code is a series of instructions given to some sort of CPU. It doesn’t matter if you are using Java, a C-based language, or something silly like PHP. At the end of the day it all becomes an instruction (or, more likely, a series of instructions) that is passed to the CPU to be executed. Let’s take a look at a line of code written in JACK, a high-level, Java-like language. ...
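The post itself uses JACK, but the same point can be made with a plain Java method: one high-level statement compiles down to a handful of lower-level instructions. The bytecode in the comments is roughly what `javap -c` prints for this method, though the exact output can vary.

```java
class AddExample {
    // One line of high-level code...
    static int add(int a, int b) {
        return a + b;   // ...becomes several CPU-style instructions:
                        //   iload_0   (push a onto the operand stack)
                        //   iload_1   (push b)
                        //   iadd      (add the top two values)
                        //   ireturn   (return the result)
    }
}
```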
Stack vs Heap
Stack vs Heap September 17, 2019 by thepigeonfighter Synopsis Stack and Heap refer to different memory segments. In general, the stack is much smaller than the heap, but the time required to access data on the stack is much shorter than the time required to access data stored in the heap. More Info This is an inherently confusing topic. Why? Because in computer science there is a data structure called a “Stack” and a data structure called a “Heap.” When people talk about the stack and the heap in computer memory, they are NOT referring to the data structures, but to memory segments inside the computer. To make it more confusing, the “stack” they are talking about actually employs the “Stack” data structure. For that reason, we will quickly describe what a “Stack” data structure is before investigating what people mean when they refer to the stack vs the heap. ...
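A minimal Java sketch of the distinction; the Point class here is just for illustration.

```java
class Point {
    int x, y;                        // instance fields live inside the Point object on the heap
}

class StackVsHeapDemo {
    static void demo() {
        int count = 42;              // local primitive: its value sits in this method's stack frame
        Point p = new Point();       // 'new' allocates the Point object on the heap;
                                     // only the reference 'p' lives on the stack
        p.x = count;                 // writes go through the reference to heap memory
    }                                // when demo() returns, its stack frame is popped;
                                     // the Point object stays on the heap until garbage collection
}
```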
How Bit Shifting & Bit Masking Work
How Bit Shifting & Bit Masking Work June 1, 2019 by thepigeonfighter In the last post we talked about binary at a basic level. If you are not comfortable reading binary or counting in binary, I recommend you check out that post before reading this one. Using bit shifting and bit masking judiciously opens the door for massive performance optimizations in low-level hardware and high-level systems alike. Read on to find out how. ...
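As a small taste, here is a Java sketch of both techniques; the color-packing example is an illustration of mine, not something taken from the post.

```java
class BitTricks {
    public static void main(String[] args) {
        // Shifting: moving bits left multiplies by 2, moving them right divides by 2.
        int eight = 1 << 3;                           // 0b0001 -> 0b1000 == 8
        int two   = 8 >> 2;                           // 0b1000 -> 0b0010 == 2

        // Masking: AND with a mask keeps only the bits you care about.
        int packed = (200 << 16) | (150 << 8) | 50;   // pack R, G, B into one int
        int green  = (packed >> 8) & 0xFF;            // shift G into place, mask off the rest

        System.out.println(eight + " " + two + " " + green);   // prints: 8 2 150
    }
}
```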
What is Binary?
What is Binary? April 17, 2019 by thepigeonfighter Learning how to understand binary will change how you see all numbers. What is binary? I am sure you have asked that question at least once or twice before. In the past, I would ask that question and end up in some article written for a college Computer Science course that over-explained what is a pretty simple idea, adding poor, unintuitive examples that further muddied the waters. So I want to remove the mystery once and for all and take your skills from zero to one when it comes to binary. ...
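A tiny Java illustration of the place-value idea the post builds on (my own example, not the post's):

```java
class BinaryBasics {
    public static void main(String[] args) {
        // Each binary digit is a power of two, just as each decimal digit is a
        // power of ten: 1101 = 8 + 4 + 0 + 1 = 13.
        int thirteen = 0b1101;
        System.out.println(thirteen);                    // 13
        System.out.println(Integer.toBinaryString(13));  // 1101
    }
}
```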
How does your Processor Process?
How does your Processor Process? March 30, 2019 by thepigeonfighter An exploration into how your Processor works. In previous posts I have described how computers store information and how the special ALU chip can run elementary operations on that information. In this post I will endeavor to bring it all together into a cohesive overview of how your processor works. Before I dive into the details, I want to step back and talk about the overall architecture. ...
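As a rough preview of the idea, here is a heavily simplified fetch-decode-execute loop in Java; the two opcodes are invented for illustration and bear no relation to any real instruction set.

```java
class TinyProcessor {
    // Hypothetical 2-instruction machine: opcode 1 = load a value into the
    // accumulator, opcode 2 = add a value to the accumulator.
    public static void main(String[] args) {
        int[] program = {1, 5, 2, 7};        // LOAD 5, ADD 7
        int pc = 0;                          // program counter
        int acc = 0;                         // accumulator register

        while (pc < program.length) {
            int opcode  = program[pc++];     // fetch
            int operand = program[pc++];
            switch (opcode) {                // decode + execute
                case 1: acc = operand;  break;
                case 2: acc += operand; break;
            }
        }
        System.out.println(acc);             // 12
    }
}
```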
How Computers Remember
How Computers Remember February 20, 2019 by thepigeonfighter A brief explanation of computer memory. In the last post I explained how the ALU in your CPU works. As impressive as that is, it doesn’t do much good unless you can store the results, which brings us to the problem of how computers store information. In general there are two types of memory that most computers employ: temporary memory and persistent memory. We will be spending our time explaining temporary memory. Suffice it to say that persistent memory is what gets written to your hard drive or another storage device, whereas the bulk of temporary memory lives in your RAM (Random Access Memory). The difference is that when the power is removed from your RAM, all of its contents are effectively lost; with hard drives, on the other hand, the data persists even without power. ...
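A toy Java model of the RAM idea: a bank of addressable words you can read and write, whose contents simply vanish with the object (my own sketch, not the chips built later in the series).

```java
class ToyRam {
    private final short[] registers;       // each slot models one 16-bit word

    ToyRam(int words) { registers = new short[words]; }

    // Write 'value' into the word at 'address' (like asserting the load bit).
    void write(int address, short value) { registers[address] = value; }

    // Read the word currently stored at 'address'.
    short read(int address) { return registers[address]; }

    public static void main(String[] args) {
        ToyRam ram = new ToyRam(8);
        ram.write(3, (short) 1234);
        System.out.println(ram.read(3));   // 1234 -- until the "power" (object) goes away
    }
}
```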
The ALU
The ALU February 3, 2019 by thepigeonfighter The Arithmetic-Logic Unit: all of mathematics solved with six bits. The ALU is the workhorse of the CPU. It is responsible for almost all computations. In previous posts I have described building various kinds of chips with somewhat simple functionality. Each chip was required to have a specific behavior that can be repeated, giving the same result every time. The last chip I described, the Full-Adder, was the first chip with some immediate mathematical utility. Well, this is the post where we bring everything together to create the most complicated chip we have made thus far, which will be able to do the following: ...
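The six bits in the subtitle are presumably the ALU's six control inputs; assuming they are the zx, nx, zy, ny, f, and no bits of the Hack-style ALU this series appears to build, a rough Java sketch of the behavior looks like this.

```java
class HackAluSketch {
    // One possible reading of the "six bits": zero/negate each input,
    // choose add or AND, then optionally negate the output.
    static short compute(short x, short y,
                         boolean zx, boolean nx, boolean zy, boolean ny,
                         boolean f, boolean no) {
        if (zx) x = 0;                  // zero the x input
        if (nx) x = (short) ~x;         // bitwise-negate the x input
        if (zy) y = 0;                  // zero the y input
        if (ny) y = (short) ~y;         // bitwise-negate the y input
        short out = f ? (short) (x + y) : (short) (x & y);   // add or AND
        if (no) out = (short) ~out;     // bitwise-negate the output
        return out;
    }

    public static void main(String[] args) {
        // zx=0 nx=0 zy=0 ny=0 f=1 no=0 selects "x + y"
        System.out.println(compute((short) 2, (short) 3,
                false, false, false, false, true, false));   // 5
    }
}
```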
In Addition to That…
In Addition to That… January 26, 2019 by thepigeonfighter This is where we learn how to do addition at the lowest levels of computer architecture. After completing project one you are left with five 16-bit chips and no real idea where to go next, or why you made them. Luckily, you don’t have to be as smart as the guys who invented the computer in the first place, and you can always just continue on with the course. In project two you are commissioned to build five more chips. Four of them have to do with addition, and the fifth is the main ALU chip of the CPU, which I will talk more about in a later post. One question you might have is what exactly I mean by 16-bit chips. Well, in previous posts I have talked about the NOT and the MUX chips. Both of those would be considered single-bit chips, because you are feeding them one bit (a zero or a one) at a time. A 16-bit version of those can handle inputs of 16 bits at the same time. Why would you want to do that? Other than the obvious speed benefit of using 16 bits, all of a sudden you have the ability to start representing numbers greater than one. In fact, an unsigned (only positive) 16-bit number can be as large as 65535, which is awesome because, I don’t know about you, but I am ready to get back to the real world where 1 + 1 = 2, despite what Boolean algebra tries to tell you. But before we ascend to the world of comfort and normalcy that normal arithmetic offers, we must descend back into the weird world of binary. Let me give you a couple of examples of how some of these 16-bit chips work. ...
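Here is a Java sketch of the idea behind those adder chips, built only from boolean operations; it is my own rendering of a full adder chained into a 16-bit ripple-carry adder, not the HDL from the project.

```java
class RippleAdder {
    // Full adder: adds bits a and b plus a carry-in, producing a sum bit and a
    // carry-out bit. Index 0 is the sum, index 1 the carry.
    static boolean[] fullAdder(boolean a, boolean b, boolean carryIn) {
        boolean sum   = a ^ b ^ carryIn;
        boolean carry = (a & b) | (carryIn & (a ^ b));
        return new boolean[] { sum, carry };
    }

    // Chain 16 full adders so each one's carry feeds the next: a 16-bit adder.
    static boolean[] add16(boolean[] a, boolean[] b) {
        boolean[] out = new boolean[16];
        boolean carry = false;
        for (int i = 0; i < 16; i++) {         // bit 0 is the least significant
            boolean[] r = fullAdder(a[i], b[i], carry);
            out[i] = r[0];
            carry  = r[1];                     // carry past bit 15 is simply dropped
        }
        return out;
    }
}
```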
Muxtopia
Muxtopia January 26, 2019 by thepigeonfighter My last post introduced the trivial but necessary “Not” chip. Before I move away from the first project I would like to mention one last chip that I found intriguing and especially difficult to solve. That chip is called a “Multiplexer” chip, or a “Mux” chip for short. This chip has a much more complicated specification than the “Not” chip. How it works The mux chip has three inputs, which I will refer to as ‘a’, ‘b’, and ‘sel’, short for selector. The mux chip also has just a single output. The functionality we want from this chip is: when the ‘sel’ input is 0, the output should equal ‘a’; when ‘sel’ is 1, the output should equal ‘b’. From that information we can derive the following truth table. ...
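A one-method Java rendering of how the mux can be built from the gates covered so far, out = (a AND NOT sel) OR (b AND sel); the real chip is written in HDL, so treat this as a sketch.

```java
class MuxSketch {
    // When sel is false, the first term passes 'a' through;
    // when sel is true, the second term passes 'b' through.
    static boolean mux(boolean a, boolean b, boolean sel) {
        return (a && !sel) || (b && sel);
    }

    public static void main(String[] args) {
        System.out.println(mux(true, false, false));  // sel=0 -> a -> true
        System.out.println(mux(true, false, true));   // sel=1 -> b -> false
    }
}
```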