I think this is pretty much the last article I’ve written for PCPlus that discusses algorithms in a fairly formal sense. As I said last time, my editors and I have slowly been moving my articles towards “how it works” topics rather than the traditional “layman’s guide to algorithms” subjects I’m perhaps better known for.
This one was essentially a discussion of big-Oh notation and how we can use it to categorize algorithms in terms of their overall performance and memory usage. To illustrate the notation, I took the abstract data structure known as an associative array, or dictionary, implemented it in several different ways, and showed each implementation’s efficiency in big-Oh terms. I discussed not only the average-case big-Oh values but also the best-case and worst-case scenarios.
The implementations I used were unsorted arrays, sorted arrays (with binary search), hash tables, and balanced binary search trees; I also briefly touched on radix trees and ternary trees.
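To give a flavour of the kind of comparison the article made, here’s a minimal sketch of my own (not taken from the PCPlus piece) of a dictionary backed by a sorted array with binary search; the class name `SortedArrayDict` is just an illustrative choice. Lookup is O(log n), but insertion is O(n) because items have to be shifted, whereas a hash table like Python’s built-in `dict` averages O(1) for both. That trade-off is exactly the sort of thing big-Oh notation makes easy to state.

```python
# Sketch only: a dictionary backed by a sorted array of keys and a
# parallel array of values, using binary search for lookups.
import bisect

class SortedArrayDict:
    def __init__(self):
        self._keys = []    # kept sorted at all times
        self._values = []  # values parallel to self._keys

    def put(self, key, value):
        # Binary search for the insertion point: O(log n) comparisons...
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            self._values[i] = value          # key exists: replace its value
        else:
            # ...but inserting shifts every later element: O(n) overall.
            self._keys.insert(i, key)
            self._values.insert(i, value)

    def get(self, key, default=None):
        # Lookup is pure binary search: O(log n) in the best, average
        # and worst cases alike.
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            return self._values[i]
        return default

# Same operations as a hash table, very different big-Oh profile.
d = SortedArrayDict()
d.put("cat", 1)
d.put("ape", 2)
print(d.get("cat"))   # -> 1, found via binary search
```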
Not too bad an introduction to big-Oh, if I say so myself.
This article first appeared in issue 293, April 2010.
You can read the PDF here.
(I write a monthly column for PCPlus, a computer news-views-n-reviews magazine in the UK (actually there are 13 issues a year — there's an Xmas issue as well — so it's a bit more than monthly). The column is called Theory Workshop and appears in the Make It section of the magazine. When I signed up, my editor and the magazine were gracious enough to allow me to reprint the articles here after, say, a year or so.)
Now playing:
dZihan & Kamien - Stiff Jazz
(from Gran Riserva)