This page is about computers. If computers don't interest you, stop reading now. Computers are part of my work, which probably makes me a bit of an "expert", whatever that means. I have an ability to make computers behave just by looking at them, which can be quite embarrassing for my colleagues.
Ferranti Mark 1
In the "early days" (1950s), electronic computers occupied whole buildings, and talked to each other by punching holes in bits of cardboard. The history of calculating machines goes back much further, as explained by the American IEEE Computer Society. Developments in computing in the UK were just as important as those across the Atlantic. Details may be seen at the Virtual Museum of Manchester Computing. The image on the left is from the sales brochure for the Ferranti Mark 1, the world's first general-purpose commercial computer, which was available between 1951 and 1958. |
Apple II desktop computer
Hand-held calculators first appeared in the early 1970s. The Apple II, introduced in 1977, included a colour screen, sound and graphics. The idea of a desk-top computer was considered gimmicky until 1979, when Dan Bricklin produced "VisiCalc", the first spreadsheet, for the Apple II. During the 1980s, fierce competition for the desktop computer market brought about many changes: hard drives were added, prices tumbled, windowing was introduced, and processor power soared. The introduction in 1985 of PostScript (again on the Apple) led to desk-top publishing, another killer app for the desktop computer.
Palm-Pilot handheld computer
Portable and mobile computing took off in the 1990s. Computers the size of video cassettes became available. With the advent of the World Wide Web in 1991, worldwide electronic communications became available to the masses. By the late 1990s, computing power previously unimaginable could be carried in a shirt pocket or worn as a wristwatch. Mobile connectivity permitted users to send messages to each other even on the move.
Please note: I do not sell computers for a living.
In 1993, the BBC Micro I had used as a student was already obsolete. I shelled out about £1000 for a top-of-the-range "486", which was obsolete within a year. I have never purchased a new computer for myself since.
Rather than buying new, I aim to spend about £200 every two years upgrading parts; I enjoy putting the bits together. All the computers I use at home were assembled in this way from pieces discarded by my employer, salvaged from other machines, or bought new or second-hand. It doesn't matter to me that a part is five years old if it does the job, but beyond about that age, parts are often suitable only for "museum" use.
My main computer-related interest is computer graphics. For this purpose, the fastest processor ever made is not fast enough, so I make do with what I have. The only other things my computer gets switched on for are email and general office work. For these, the machine that was discarded three years ago is more than adequate. A faster processor isn't going to improve my internet connection speed, nor will it make me type any faster.
People ask me what I do with computers. You could watch me sit in front of a computer all day and hardly move. I have variously done the following, which are all true, and probably say more about me than I care to imagine.
Some thoughts on the future of computer technology. (August 2001)
It is curious that computer manufacturers have convinced their customers that the fastest machines and the biggest feature sets are essential as soon as they become available. Fast machines have their place in computer graphics, software development and games. Manufacturers will continue to persuade the general public to part with a few months' wages for a machine which will be obsolete within a few months, and which will spend most of that time doing nothing.
For a good number of years yet, computers will continue to get faster, despite scientists telling us the physical limits have been reached (engineers just find other ways around the problem). The advertised speed of computers will continue to increase (Moore's law, loosely stated: it doubles every 18 months), but the perceived performance will remain the same because of the ever-greater bulk of unnecessary "features" in commercial software. Until software vendors write software that does what we want it to do (and not all those other things we never realised we needed to do as well), this situation won't change. So the way to beat the system is to buy the lowest-spec, bottom-of-the-range new kit you can find.
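To put a rough figure on that claim, here is a quick sketch of what an 18-month doubling implies, assuming (purely for illustration) that advertised speed really does follow it:

```python
# Rough illustration of the "doubles every 18 months" claim.
# The time spans here are examples, not measurements.
def speed_multiplier(years, doubling_period_years=1.5):
    """Factor by which advertised speed grows after `years`."""
    return 2 ** (years / doubling_period_years)

# A machine bought today versus one bought six years later:
print(f"{speed_multiplier(6):.0f}x faster on paper")  # 16x faster on paper
```

Sixteen times the advertised speed in six years, and yet the word processor feels no quicker.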
The three-dimensional display will soon make its debut. The amount of data necessary to store a single 3D still "image" is easily measured in gigabytes (work it out!). When 3D movies appear, a five-minute film will not fit onto a stack of DVDs, so a new data storage medium with a correspondingly silly acronym will be required.
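Taking up the invitation to work it out, here is one back-of-the-envelope version; the resolution, colour depth and frame rate are my own assumptions, chosen only to show the order of magnitude:

```python
# Back-of-the-envelope storage for a volumetric 3D "image".
# Resolution, colour depth and frame rate are illustrative assumptions.
voxels = 1024 ** 3            # a 1024 x 1024 x 1024 volume
bytes_per_voxel = 3           # 24-bit colour
frame_bytes = voxels * bytes_per_voxel
print(f"one still: {frame_bytes / 1e9:.1f} GB")      # ~3.2 GB

# A five-minute "movie" of such frames at 24 frames per second:
movie_bytes = frame_bytes * 24 * 5 * 60
dvds = movie_bytes / 4.7e9    # single-layer DVD, ~4.7 GB
print(f"five minutes: {movie_bytes / 1e12:.1f} TB, about {dvds:,.0f} DVDs")
```

Around twenty terabytes for five minutes, or a stack of roughly five thousand DVDs: hence the need for a new medium.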
The idea of a working-wall has also yet to materialise. This concept permits several people to work independently on the same graphical space (most likely a large wall, like a whiteboard). Each member may have several documents on the wall, and any of these may be transferred to a different member for review or modification.
In a networked environment, I will be able to start doing something on one computer, and, if that machine is abruptly switched off or develops a fault, resume whatever I was doing a moment later on another machine anywhere in the building, even if someone else is using it at the time. (The technology to do this exists today, but not with the popular operating system chosen by most businesses.)
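As a toy illustration of the idea (and nothing more: the file name, location and state format below are my own inventions, not the technology alluded to above), an application that checkpoints its state to shared storage after every change can be resumed from any machine:

```python
# Toy sketch of session migration via shared storage.
# Paths and state format are hypothetical, for illustration only.
import json
import tempfile
from pathlib import Path

# Stand-in for a network share that every machine in the building mounts.
SESSION_FILE = Path(tempfile.gettempdir()) / "session.json"

def checkpoint(state: dict) -> None:
    """Record the working state after every change, so nothing is lost."""
    SESSION_FILE.write_text(json.dumps(state))

def resume() -> dict:
    """On any machine, pick up exactly where the last one left off."""
    if SESSION_FILE.exists():
        return json.loads(SESSION_FILE.read_text())
    return {"document": "", "cursor": 0}

# Machine A is switched off mid-sentence...
checkpoint({"document": "Half-finished paragraph", "cursor": 23})
# ...and machine B resumes a moment later.
print(resume()["cursor"])  # 23
```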
One of the biggest hindrances in computing today is usability. Despite the "advances" of graphical operating systems, many of the people I have met find it difficult to use a computer! Voice control may help, but if it is designed as badly as graphical interfaces have been, it has the potential to make things worse. The usability issue permeates all aspects of computing, and much work needs to be done to get the computer to look after itself. This behaviour is most needed in systems administration and software development, though other disciplines will benefit too.
Computer games are one of the few areas of computing that genuinely push the boundaries of the technology, in terms of hardware, mathematical technique and usability. It is sad to think that most people playing them have no idea of the complexity behind their experience. They appear to have lots of money, though. It is also sad that some of the playability of the early computer games (Arkanoid, Pac-Man etc.) has been lost in favour of violence. Some more originality would be refreshing.
So there you have it. Now you know exactly what I do all day, and you can bluff your way in computing. Yeah Right. Maybe it's time to get some fresh air and take the camera out...
Copyright © 1994-2002 Justin Watkins
If you are visually impaired, I would particularly value your feedback on your experience of the graphics on this page.