Posted by: movieotaku | April 6, 2007

Computing 101

I grew up with computers. My father was a mainframe COBOL programmer. I cranked out hundreds of lines of BASIC by the time I was 12. I knew DOS inside-out before most people had ever heard of Bill Gates. This knowledge was a great benefit in every area of my life but one: watching movies and television.

Since putting this site up, several readers have suggested I add computer blunders to this page. I decided to take it one step further and create a Computing Science According to Hollywood. I’m going to look at computers the way Hollywood sees them and compare that with reality. It’s an interesting exercise because you can see how the general public views computers and what people expect from them.

User Interfaces

The Stone Age

Some of the earliest portrayals of computers involved big rooms filled with row upon row of metal cabinets, or sometimes just a wall. There would always be spinning tape spools, flashing lights, toggle switches, buttons and dials. The scientist would program the computer using these controls on the front panel. The machine would whir and click away for a few seconds before returning an answer in lights on the front display panel. Sometimes, you would get a mechanical voice returning a result or a teletype printout. In the old Batman series from the 1960’s, Batman always got a punched card from the Bat Computer.


In the very early days of computers, this was partly accurate. You operated computers through those switches and dials, but you programmed computers using punched cards or a tray of beads. What you didn’t do was get it working in minutes. Writing a computer program took a lot of work because you could only program the machine in its native code by way of an assembly language. Assembly language is a very primitive programming language that can only do simple math, compare numbers and move data around a machine word at a time. It would take a programmer days to design, code up, hand-test and submit his program. Then, if you were lucky, it worked right the first time. Things got a little better when compilers became available, but it still took time. You couldn’t just twist a dial and punch some buttons to get an answer.

Once you got your program running, the most common way to get your result back was through a teletype machine: essentially an electric typewriter connected to the computer. The tradition of flashing lights comes from ENIAC, America’s first electronic computer. The engineers wanted to show off ENIAC to the press, but when ENIAC was running, there was no indication anything was happening. Someone suggested wiring up a bank of light bulbs to ENIAC’s accumulator so it would show the current working total as it added up a list of numbers. Unfortunately, from then on everyone assumed computers had flashing lights to show they were working. The tradition was even carried on by the Connection Machine CM-5 supercomputer, which has a huge bank of red LEDs on its side that makes a pretty pattern of lights as it operates.

Voice Activated Computing

In 1966, Gene Roddenberry created a series called Star Trek. He took the time to talk to various experts about what the future would be like. The guys at Sperry Rand, who gave us UNIVAC, said we would talk to computers in the future. The computers would understand human speech and program themselves (in other words, artificial intelligence). It worked great for television because it prevented long, dreary scenes of Spock turning dials, flicking switches and pressing buttons. Now dramatic moments could come from talking with the computer. The idea was so compelling that voice became the standard computer interface for TV and movies until the 1980’s.

The opinion on voice-operated computers has swung back and forth over time. Today, voice-operated computing has come back in vogue because mobile phones and palm computers offer more features, but they don’t have keyboards. The only problem has been recognizing the human voice. It’s turned out to be harder than anyone thought. People have so many accents, personal pronunciations and odd grammar idiosyncrasies that writing software which can tell the difference between “order sandwiches” and “order sand witches” has become the hardest part. Luckily, we have the computing power to do it now.

The Rise of the Personal Computer and DOS Prompt

[Image: the Apple II]

In the mid-1970’s, a hobbyist realized he could make money selling a self-assembled computer kit. Called the Altair, and based on an early Intel microprocessor, it was advertised in Popular Electronics magazine, and its creator soon struggled to keep up with demand. It did require people to program it by flipping switches, turning dials and pressing buttons on the front panel, but I think that was part of the charm. A couple of hobbyists, Steve Jobs and Steve Wozniak, designed a new personal computer: the Apple I. It replaced the switch flipping and lights with a video terminal and keyboard.

Later, the Apple II was released. It could operate a disk drive, but needed a special program, a disk operating system (DOS) called APPLEDOS. When IBM got into the PC game, they asked Microsoft to provide a DOS for it: PC-DOS.

DOS used a difficult-to-master command language. To get a listing of the files on a disk, you typed ‘DIR’ (MS-DOS) or ‘CATALOG’ (APPLEDOS). Printing a file required a command like:

copy myfile1.txt lpt1:

A lot of people who had to use DOS found it difficult to get used to and really did not like it. When Hollywood tried to depict using a computer, they usually had someone type in a plain English command like:

print the file

Print the file? Which file? Which printer? This kind of blunder quickly told you who had used a computer and who had only heard about PC’s on the news.

There were precious few good examples of PC usage, but one TV show that did a better job than most was a CBS show called Whiz Kids. It was written as if the writers actually knew what they were talking about (see the Hacking section for more details).

Unfortunately, Whiz Kids was an exception. Hollywood continued to give us jumbled jargon and usage until 1984 when Steve Jobs unleashed his next great thing.

Graphical User Interfaces

When the Macintosh was released, there was finally a computer Hollywood could cope with. Its easy-to-use mouse-and-windows interface quickly became part of the culture. Who could forget Scotty’s famous “Computer, computer?” from Star Trek IV: The Voyage Home?

Most of the time, Hollywood managed to get the gist of graphical user interfaces (GUIs). They tended to use a custom-designed look instead of an off-the-shelf package. This is understandable because the movie and TV GUIs had more animation and a prettier look. Most blunders involved extraordinary capabilities (see the Capabilities section) or misusing an off-the-shelf package (using one off-the-shelf application to represent something else). An interesting example of that was in Jurassic Park.

In Jurassic Park, the young girl figures out how to reboot the computer system by sitting down at it, seeing a really nifty 3-D interface, and exclaiming, “It’s a UNIX system! I know this!” First off, if she wasn’t so young, I’d marry her. Any woman who knows and likes UNIX is OK in my books. Secondly, that line made everyone who knew UNIX burst out laughing. UNIX is notorious for its command-line prompt, which is like DOS but with even more cryptic commands (‘ls’ instead of ‘dir’, ‘rm’ instead of ‘del’).
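
For anyone who never had to suffer through both, here is a rough side-by-side of a few everyday commands (the file names are just examples):

MS-DOS                   UNIX
DIR                      ls
TYPE README.TXT          cat readme.txt
COPY A.TXT B.TXT         cp a.txt b.txt
DEL MYFILE.TXT           rm myfile.txt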

The interface she was using is called ‘fsn’ and is a demonstration from Silicon Graphics, Inc. to show off their 3-D workstations. It makes sense considering Silicon Graphics was a major product placement sponsor of the movie.

Future Interfaces

Voice recognition has made a big comeback and is the basis of several computer interfaces, including IBM’s wearable PC. There is research into direct connections to our nervous system and brain so we can just think what we want the computer to do. There are some nifty experiments with alternative user interface devices, like Virtual Reality gloves, and computers that can read our gestures and eye movements, but they’re still in the lab. Whatever happens, I still think it would be a good idea to keep your keyboarding skills up.

Virtual Reality

What We’ve Seen

William Gibson wrote a classic cyberpunk novel called Neuromancer in the early 1980’s. It described a near-future society where computer technology was ubiquitous, society had split into the haves and have-nots, and directly jacking your brain into a computer was commonplace. It got a lot of people excited, including computer scientists who were working on a fringe technology called virtual reality.

Ironically, William Gibson was totally computer illiterate. He wrote the novel on his kitchen table with a manual typewriter; he didn’t even own a computer when he wrote it. Everything he wrote was absorbed from the media or invented from his own imagination. The media took William Gibson’s ideas and parroted back a version of virtual reality which has little to do with reality, virtual or otherwise.

The media version of virtual reality, or cyberspace, shows a rich, colorful, 3-D environment that you can cruise around in. You just plug into a wall socket or computer and you are there. Virtual reality is richly detailed, and the laws of physics are simulated in full. Everything is modelled down to the nitty-gritty details. In the movies, you can’t tell the difference between reality and virtual reality. According to TV, if you die in virtual reality, you die in real life. People can supposedly blow up your head through virtual reality.

It’s ironic that the original virtual reality, movies and television, got caught up in the virtual reality hype. You heard predictions about “the end of civilization” as people would jack into virtual reality and never come back. Virtual reality would change the world and nothing would ever be the same again. All this would happen within 20 years, the futurists proclaimed.

What We’ve Got

It’s the year 2001, and we’ve got our space station, but the only virtual reality I see these days comes from first-person shooter games. Cyberspace has come true: a virtual world inside the computer where everyone will spend a part of their lives. We just call it the Internet.

Just about every PC now has video cards capable of generating rich 3-D worlds that would have required a supercomputer just 20 years ago. But what do we do with it? Play a game that lets us blow away our co-workers in a safe, controlled environment. You have to ask: what happened?

Just 10 years ago, VR experts claimed the biggest impediment was the hardware. They said we needed faster computers with more memory. If we got faster computers, everyone could enjoy VR. My computer is now powerful enough to run a simulation of a special ops anti-terrorist team with free run of a detailed world with a dozen or more AI characters to fight, and my computer was behind the curve when I bought it. Today’s computers can create some stunningly beautiful worlds, so we have the hardware, but Gibson’s cyberspace is still just a fantasy.

2008: First-person shooters are getting way more realistic. With talk of real-time ray tracing and graphics processing units (GPUs), virtual reality is real, but we don’t use the headsets. World of Warcraft is the closest we’ve come to Cyberspace.

The problem is embarrassingly simple: bandwidth. As I write this, there are virtual immersion projects being developed in the United States. As part of the Internet2 project, researchers are creating a teleconferencing system that will enable people to talk to each other like they were in the same room. The campus network staff know when a virtual immersion test is under way because their routers suddenly light up like Christmas trees; their gigabit capacity maxed out by the project.

In the researchers’ offices, they stare grinning at each other wearing 3-D glasses. Their simulation is only running at 10 frames-per-second. Your TV runs at 60 half-frames per second. Your child’s Pokemon game runs at 30 full-frames per second. The researchers say it will get better as the Internet2 project begins to push past the gigabit rates currently available, but the problem still remains: this is a big-money research project for industry leaders and academia. This is not something the average user will get to use for 20 years or more.

In order to get the detail and realism of TV and movie virtual reality, you need a lot of bandwidth. Your 56K modem and your cable modem tied together couldn’t even get you a frame per second. The futurists and VR-evangelists of 10 years ago said bandwidth wouldn’t be a problem because we’d all have fiber optic running to our homes giving us gigabit connectivity. As I’m sure you realize, almost no one has fiber optic to their door yet. The last mile has proven to be an almost insurmountable problem. If you do get fiber optic to your house, you’ll be paying thousands of dollars per month for mere megabit capacity. (2008: Thank you, Verizon, for making a liar out of me)
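
To put some rough numbers on it, here is a back-of-envelope sketch in Python (my assumptions, not anyone’s spec: a modest 640x480 picture, 24-bit color, 30 frames per second, no compression):

# Back-of-envelope only; the resolution and frame rate are assumptions.
bits_per_frame = 640 * 480 * 24            # about 7.4 million bits per frame
bits_per_second = bits_per_frame * 30      # about 221 million bits per second
modem_bps = 56000                          # a 56K modem
print(bits_per_second / 1e6, "Mbit/s needed for full-motion video")
print(modem_bps / bits_per_frame, "frames per second over a 56K modem")

Even a multi-megabit cable modem falls short of that by a factor of 40 or more, which is why real video on the Net leans so heavily on compression.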

The other problem is the Internet backbone. Some people believe the Internet backbone will reach a saturation point that will become impossible to fix without a massive technology upgrade to the newest TCP/IP protocol. The Internet2 project was created to address these issues, but they’re predicting rollout could be 20 years or more down the road. The technology for gigabit capacity is still bleeding-edge and very expensive. Until we have the technology to handle a multi-gigabit global Internet, I don’t think we’re going to be getting virtual reality any time soon.

2008: The Internet backbone is still an issue, but it has survived YouTube, P2P and BitTorrent. And now comes the gold rush to put HDTV on the Net. Somehow, the backbone keeps holding up.

Capabilities

William Gibson said he was disappointed when he got his first computer. He found it couldn’t do all the things he thought it could. This is one of the more subtle inaccuracies in the media. On Star Trek: The Next Generation, Data or Dr. Crusher would ask the computer to perform a complex calculation or analysis, or Captain Picard would ask the computer a very subtle and complex question and get back the answer fairly quickly. In 2001: A Space Odyssey, HAL can carry on a conversation and pass a Turing Test without blinking an unblinking eye. We’ve seen computers do amazing things with amazing speed, knowledge and problem-solving ability. If computers could do everything they do in movies and television, my life would be a lot easier. I could just ask my computer to research all the references I need for this essay!

I think the problem is that society views the things computers do with awe and amazement and can’t wrap its mind around the idea that some things are very hard for a computer, especially the kinds of things we take for granted, like seeing.

Computer Analyses Made Easy

The one thing I see in television and movies that I wish were true is computer analysis. In real life, if I wanted to analyze a large database, I would have to spend several days (sometimes weeks) writing a computer program to analyze the data. I have to figure out everything for the computer: how to fetch the data, how to compute the analysis and how to present the results back to me. This is the current state of the art. There are increasingly more applications that make things easier, but they are little more than summarizers. If you want to do any real, complex analysis, you have to start writing code.

In Star Trek: The Next Generation, Dr. Crusher says it will take her three hours to program the computer to take every star in the galaxy, figure out their relative locations several billion years ago and then find a pattern scattered amongst those stars. Man, what I wouldn’t give for a computer that could do that, but there are several implicit assumptions about computer capabilities in that scene. One is that you can just talk to the computer and the computer will figure out what you want. We are still a decade or more away from that. Computers know nothing. They can only do amazing things if some human tells them explicitly how to do it. You want to know what your computer really is? Take out a hand calculator. That’s your computer. It can make a few simple decisions (is this number less than this number?) and take different steps accordingly, but that’s it. It knows nothing of galaxies, orbital mechanics or mathematical analyses. A human must still program the computer to get it to do these things.
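
To make that concrete, here is a toy sketch in Python (the star names and positions are made up) of the kind of spelling-out a human has to do just to answer a trivial question like “which catalogued star is closest to us?”:

import math

stars = {                       # made-up catalogue: name -> (x, y, z) in light-years
    "Alpha": (1.3, -0.2, 4.0),
    "Beta": (10.5, 3.1, -7.7),
    "Gamma": (-2.2, 8.8, 0.4),
}

def distance(p, q):             # the computer has no idea what "distance" means until we define it
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

here = (0.0, 0.0, 0.0)
closest = min(stars, key=lambda name: distance(stars[name], here))
print("Closest star:", closest)

Dr. Crusher’s three-hour job is that, multiplied by billions of stars, plus orbital mechanics and pattern matching, with every single step spelled out by a human first.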

Know It All’s

Another assumption is omniscience: computers know everything. Every scrap of information has been digitized and stored in the computer in a format it can easily search, analyze, manipulate and use. In Babylon 5, John Sheridan wants to find out some information, so he asks his second-in-command to do a computer search for someone named Sebastian (“last or first”) who lived at a specific address in London, England sometime during the 19th century. It takes less than a day, mostly communication lag time. Did Earth digitize every scrap of paper humanity has ever made, convert it into searchable documents and make it all available on a global network anyone can access? That would be very cool.

In Star Trek: Voyager, it has been mentioned several times that Voyager’s computers contain the entire knowledge base of the Federation. Every major work of art, entire histories, holodeck programs. The whole ball of wax. That either says something about the storage capacity of the U.S.S. Voyager or about the sum of the Federation’s knowledge.

In the real world at this time, very little information is available digitally, especially if it is old. If it is available digitally, I can almost guarantee it’s not in a format a computer can easily read and analyze for itself. This may change, but not for a long time. So if you see someone in the 20th century performing an exhaustive computer search without having to go to the library, you know you’re dealing with fantasy.

2008: I have tossed about half my paper reference collection because it’s all available on the Internet. Wikipedia is getting scary with how much information it has. Projects like Project Gutenberg and Google Books are trying to scan every book in existence and put it on-line. The one, searchable Internet may become real!

Artificial Intelligence

The final assumption is intelligence: the computer has an artificial intelligence that has been taught a great many things, especially about mathematics, science and astronomy. It can carry on and follow a conversation. That’s pretty good considering the Microsoft Word grammar checker can’t seem to tell the difference between a verb and a noun.

This might not be as far-fetched as it seems. There is a lot of work going on to make computers voice-activated. With the advent of mobile phones, which don’t have keyboards, companies are suddenly interested in this neglected field. But how far can they go? Can they create a computer that can completely understand a human being and follow a conversation? Just from my gut feeling on the matter, I’d say yes, and we may see some surprisingly good applications of this soon, but I don’t think computers could formulate an analysis on their own for quite some time. I don’t believe we’ll be able to create a general-purpose AI that can create its own algorithms in the near future. If you think about the kind of education and experience a person needs to carry out a request like “scan a planet for the capital city from orbit”, you can imagine the programming challenge. I don’t believe it’s impossible; it’s just not likely to be done anytime soon.

2008: Tim Berners-Lee has been promoting the semantic web as a solution to this. With the semantic web, people creating content add tags to their text to help computers understand what the text is about without having to understand all the English on the page. Some doubt whether the semantic web can work or is even needed, though. The more interesting development is the evolution of natural language querying. Google’s AI guru says they’re not going to do it, but others are trying their best to make it come true. It’s still pretty primitive, but this might change soon.

Networking

The Basics of Computer Networking

The idea of computer networking is so important these days that the New Economy is based almost entirely on it. The basic idea is that two separate computers can connect by a wire or radio transceiver and exchange data. Simple enough concept. Now, this part Hollywood gets right because it’s so hard to screw up. The one thing that seems a little odd is the idea that a computer could control something electrical without a special device. You’ve seen the TV movie: a computer AI goes nuts and attacks the hero with all sorts of mechanical devices. How does the AI get access to those things? Especially in the 1970’s and early 1980’s. There was a movie called Electric Dreams that looked like someone had talked to a technical consultant. The computer in that movie could control all those devices because Miles, the hapless hero, went around plugging all his appliances into it.

The Internet: Hype and Reality

The Internet is generally well depicted in the media. This is probably thanks to the high penetration the Internet has into mainstream society. Even e-mail is generally depicted well, with the exception that movies and TV shows depict e-mail systems unlike anything anyone has ever seen. Strangely though, this might get better as AOL has become a popular e-mail client for screenwriters (I think they like that “You’ve got mail!” voice clip).

One point of distinction: TV and movies treat the Internet and the World-Wide Web as the same thing. That’s like confusing the telephone and the fax machine. The Internet is the network, and any two computers on it can transfer any type of data they want between each other using any format they want. The World-Wide Web is one specific application of the Internet. People used to use the Internet for a lot more than the World-Wide Web, but unfortunately, most people are unaware that the Internet offers more services than that.

2008: Ha! People have discovered that BitTorrent is another Internet service.

Hacking

Tell me where you’ve seen this: a geek wearing thick-rimmed glasses hunched over a keyboard, typing away furiously and spouting gobbledygook as he hacks into some computer system. How many times have they ever explained what they were doing? Very few, but when they get in, fancy graphics suddenly appear on the screen. Unfortunately, that’s not what hacking is like. Real hacking means dealing with a plain text terminal that rarely gives you anything special when you log in. The only time that happens is if the system administrator has a sense of humor and a collection of ASCII art to draw on.

Another thing: hackers have tools, special software that will try common passwords, search remote computer systems for backdoors, and so on. You rarely see, or hear about, these tools, but they’re the lifeblood of a hacker. The one show I know of that showed what hackers can really be like was the CBS show Whiz Kids, the show about a geek, his homebrewed computer, his friends and the trouble they get into and out of. They showed proper password phreaking and even talked about legitimate countermeasures. For example, in one episode, a reporter asks the kid to hack into a system. The hacker runs his password phreak, which tries every combination of letters to find the password, and discovers that the system, rather than give the user infinite retries, will hang up after the first incorrect password. Simple, but surprisingly effective.
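
Some rough arithmetic in Python shows why that countermeasure works so well (my assumptions: an 8-character, lowercase-only password and about 30 seconds to redial and reconnect after every hang-up):

guesses = 26 ** 8                            # every combination of 8 lowercase letters
seconds = guesses * 30                       # one guess per phone call if it hangs up every time
print(guesses)                               # about 209 billion combinations
print(seconds / (3600 * 24 * 365), "years")  # roughly 200,000 years of dialing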

Another problem is the new image of hackers as really cool-looking teenagers. Some of them are, some of them aren’t. Some of them are jocks, some of them are prototypical geeks. Some are on the honor roll, others are misfits. The hacker culture embraces a huge diversity of humanity. In fact, I find the hacker culture more diverse than most sub-cultural groups. The one bias seems to be the lack of females, but that may have more to do with women not being as interested in hacking.

2008: A more disturbing trend is the rise of the professional for-hire cracker.  These are hackers who penetrate systems for profit, and it’s getting bad out there.

Interoperability

Connecting hardware together

In Red Planet, Val Kilmer was able to rip apart the Mars Sojourner rover and hack together a solar-powered digital voice modem. Later, he was able to hook this hack into an almost-as-old Russian sample-return probe and control it. Today, I’ve been struggling with my digitizing tablet driver crashing on boot-up. In the movies and television, connecting computer hardware together seems so easy and effortless. It seems to be just a matter of plugging in most times, or at worst a quickly spliced-together cable. I work with computers day in and day out, and let me tell you: getting computer hardware to work together is hard.

Hacking hardware

Now, if systems were meant to connect, like in Aliens when the Marines were able to connect to the terraforming station’s main computer, it is entirely believable that they would work together instantly. In fact, interoperability will probably get better as time goes by, but I suspect there will still be glitches. And the idea of Space Marines being equipped to connect easily to any computer in human space is well within the bounds of reality.

Hacking together hardware of different ages, makes, or even alien species is harder to believe. Computers usually have finicky requirements, and most computer components are intolerant of different voltages, pin layouts, data rates, etc. As time goes by, computer interface standards change and it becomes nearly impossible to get two different interfaces to talk to each other. For example, I helped a friend of a friend set up her printer with her new computer. Her printer had a half-sized Centronics port while the computer had only USB. You cannot directly connect USB and Centronics ports together. USB has a complicated signal format that requires quite a bit of intelligence on the peripheral’s part. The Centronics interface doesn’t require any intelligence on the printer; all it does is transfer data back and forth. After a lot of running around, I was finally able to find all the requisite cables and, most importantly of all, a Centronics-to-USB adapter with special electronics to add the intelligence needed for USB. Somehow, I doubt Geordi La Forge could figure all that out in an hour or two while having to custom-make a lot of the parts.

2008: Connection hell is still with us. Example: you can’t connect a SATA-II-only drive to a SATA-I motherboard, and guess how hard it is to find a compatible SATA-I drive? I almost bought a new computer rather than go through the headache. I’ve also had connection problems with wireless networks. Unless you’re using the lowest-common-denominator standard, you can spend a couple of hours getting a laptop to connect wirelessly to your brand-new wireless gateway if you chose different vendors.

Software Interoperability

Another subtle issue is software interoperability. That’s when one program can directly connect to another. If we’re talking about a Federation starship, that is probably true, but I’m a little skeptical it’s that easy when it’s talking to an alien device. Which brings me to my next point.

Alien Computer Systems

In Independence Day, Jeff Goldblum is able to reverse engineer the aliens’ computer systems and write a computer virus for them. Now remember, they only had a day. The government did have the captured alien ship for nearly 50 years, but if you remember, they said they couldn’t even turn it on until the alien ships arrived. Now, if you think about how hard it is to reverse engineer a computer made by humans, imagine the difficulty with an alien system. You have to figure out the basis of the computer’s architecture, how it operates, what language the aliens program their computers in, how their software is written and what it does. The list goes on and on. Why, in SF, can a hero figure out how to use an alien’s computer when it takes the best engineers today a month to get up to speed on a new computer, even with documentation?

Communications

In the Star Trek universe, I think there must be a standards body every technologically capable civilization subscribes to. No matter what alien race they encounter, they can share audio-video information. Audio I can believe, because there are basically two types of audio transmission: AM and FM. Video is more complex. Heck, even today we have PAL, NTSC and SECAM; you need special hardware to decode them all. Now, this is believable if the spaceship has been equipped with software to convert between formats dynamically, but what happens if you meet a previously unencountered race? I guess the software must figure out by itself what the video signal format looks like (see the Artificial Intelligence discussion), but is there enough information to figure it out?

Humans see color thanks to red, green and blue sensors in our eyes (thus computers and televisions use red, green and blue to assemble the spectrum of colors), but that’s not the only way to see things. You can have ultraviolet, infrared or a completely different set of basic color receptors (from as few as 2 to as many as 5 are theoretically possible). How do you know what the transmission is using if you’ve never even studied the race’s biology? Also, would they use a left-to-right, bottom-to-top scanning pattern? What if their technology incorporates image compression?

Obviously, none of these considerations matter in media SF because when the Enterprise encounters an alien race, they can always share audio and video immediately.

To The Future…

SF TV and movie writers/producers/directors seem to have a magical view of technology. They believe it can perform miraculous feats of problem solving, intelligence and capability. Modern computing isn’t anywhere near as advanced, but is it really a problem to have such a disparity? When Jules Verne wrote From the Earth to the Moon, everyone thought it was fantasy: no one could travel to the Moon, the experts thought. Cut to almost a hundred years later, and the engineers who created the rockets that took man to the Moon credited Jules Verne for their inspiration. Wilbur and Orville Wright were inspired by Daedalus and Icarus. All throughout human history, hardheaded engineers have pointed to the power of fantasy and dreams for the inspiration to achieve more. Maybe it’s not so bad to overestimate the computer’s capabilities, because there is one truth to be learned from the Neuromancer legacy. When the new wave of Internet and VR pioneers entered the field in the 1980’s and 1990’s, almost all of them came with dreams of never-before-seen uses for computers and networking, and a copy of Neuromancer in their back pocket to guide them.

Sometimes, you need someone to dare to imagine the unimaginable to make it happen.


Responses

  1. I think it may be time to update this article. For instance “Know It All’s”, with projects like Gutenberg, Google Books, our literature is becoming available on line.

    Even now, with a specific address in London, and a name, it would be possible to find out more details about this 19th century person, just ask any genealogist. Most BDM certificates are coming on-line, there are military records that are available, and also census information (Just ask my Mum, she could spend hours telling you about it).

    What is stopping information being digital isn’t resources, it’s Copyright and DRM. How would it be if Captain Picard had to fork over hundreds of dollars every time he asked the computer to do research for him?

  2. I think it may be time to update this article. For instance “Know It All’s”, with projects like Gutenberg, Google Books, our literature is becoming available on line.

    Good point. At the time I wrote it several years back, it seemed pretty unlikely. But since then I’ve heard of a lot of projects going on-line, especially archives and even old census records.

  3. “If you do get fiber optic to your house, you’ll be paying thousands of dollars per month for mere megabit capacity.”

    Might need to update this stuff as well.

  4. I don’t know if this is exactly in the same vein as your article, but in a way I consider it a blunder of movie technology. So my question is this: Why does it seem like there are a lot of movies where the text interface shows lines of text appearing one letter at a time, whether it is fast or slow… That would imply that every key is its own “Enter key”. Does this date back to an actual system that was made to do it? Is it just to look cool?

    Notable uses of this are in Alien and The Matrix; there are many others. I’m sure I’ve seen it dozens of times, and it’s always bugged me. Maybe you can enlighten me.

