Here's a brief list of interesting facts.
The show "Let's Go Boating" episode #109, hosted by Lou McNally and edited by yours truly, began its life as a digital film when a vast collection of light waves hit a lens in a PD150 digital video camera and were split onto three color-recognition computer chips: one for green, one for red, one for blue. This is the natural splitting of light into components, as we see it, though the brain compensates in an interesting way for anything except green. I'm rusty on the details of this, so I'll skip it. The chips sent signals through a series of wires and placed them all on a digital recording medium, which is actually a tape. The tape records a series of digital signals in what is, paradoxically, a basically analog method of linear encoding.
What's important is that this is the first time the light waves are compressed, or changed from their original form into DV. The camera makes a digital representation of the analog input, which is called DV compression. DV engineers are the only people in the entire business trying to make terminology easy and consistent.
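That "digital representation of the analog input" step can be sketched in a few lines. This is a hedged toy model, not the actual DV codec (which also does DCT-based spatial compression and chroma subsampling); it just shows analog values being sampled and quantized into 8-bit levels, which is the heart of any digitization.

```python
import math

def quantize(value, bits=8):
    """Map an analog value in [-1.0, 1.0] onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    # Clamp to the representable range, scale to [0, levels - 1], round to a level.
    clamped = max(-1.0, min(1.0, value))
    return round((clamped + 1.0) / 2.0 * (levels - 1))

# "Analog" light intensity arriving at one sensor photosite over time,
# modeled here as a smooth wave.
analog = [math.sin(2 * math.pi * t / 20) for t in range(20)]

digital = [quantize(v) for v in analog]
print(digital[:5])  # → [128, 167, 202, 231, 249]
```

The smooth curve becomes a ladder of integers; everything between two rungs is thrown away, which is why digitization is already a form of compression before any codec touches it.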
After that, the raw footage that would someday become LGB 109, as it's labeled in a dark closet on 222 St. John's St., was hooked up to a computer cable input device, which transferred the DV video into the computer through a complex digitizing process: the DV encoding was sent through another compression pass that re-digitized the separate light values (red, green, blue: RGB), encoded the values, and composited, or layered, them all back into a single video file stored on a G3. It was stored on a G3 because my boss has curious ideas about where money should go, and has not upgraded to G4s yet. Regardless, the DV audio went through a much more traumatizing process: it left the camera in analog form and was put into the computer, where it was re-digitized at 44,100 samples per second (44.1 kHz, a measure of how many samples are taken each second) into a digital coding so it could be freely cut up on a computer.
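The 44,100-samples-per-second figure is easy to make concrete. A hedged sketch (the tone and function names are illustrative, not the actual capture pipeline): one second of a pure 440 Hz tone sampled at 44.1 kHz is just a list of 44,100 numbers.

```python
import math

SAMPLE_RATE = 44_100  # samples per second (44.1 kHz)

def sample_tone(freq_hz, seconds, rate=SAMPLE_RATE):
    """Return the sampled values of a pure sine tone at the given rate."""
    n = int(rate * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / rate) for i in range(n)]

one_second = sample_tone(440.0, 1.0)  # concert-pitch A
print(len(one_second))  # → 44100
```

Those 44,100 numbers per second are what "digital audio" means; everything the wave did between two samples is simply not recorded.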
The point of this entire process was to make it possible to move parts of the video around without having to hit rewind or fast forward.
After the data spent a while getting processed, reprocessed, and altered in the G3, it had to be taken off the hard drive and transferred to my computer. To do this, it had to go back to the DV tape, DV video to DV video, audio to analog then back to digital. Then it had to go from that DV into my computer.
Now, my computer uses a program that is "DV native", which means it uses the same compression for basic viewing and capturing that the camera does, so there's no difference in what is contained on the computer versus what is contained on the tape.
Theoretically. In a minute we'll be getting to the point, which is where this gets called into question.
The file then went off my computer, via DV, back to another computer, and was finally printed onto a Beta tape.
Which is analog.
The Beta tape was then sent to a television studio, where it was immediately put into another computer system, back to digital, which sends signals straight to the transmitters, which turn the signal into an analog wave for reception on TVs, which receive the analog wave and send it to the tube, which translates the wave into an analog picture. Or it would be analog, except that TV screens are pixelated: a digital representation of analog.
The point is that the digital revolution, though handy in speeding up how the world moves and breathes, has hyped itself on a whole lot of bullshit.
One of the tenets of digital media is that there is no signal loss in digital transfer. As you can see from the above, the whole transferring process is a mix of digital and analog cables and wires and tapes, and even if the process were a purely DV-based system, the analog nature of the tape makes it a philosophically curious mix of the two forms of data. Furthermore, files corrupt, and alter themselves in freakish and irrevocable ways. This may seem like a nit-picking point, and you may say that computer errors cannot be compared to analog errors, but this is only because we have a popular notion of computers as inherently controlled systems, and a counter-pop vision of nature as inherently uncontrolled. The deterioration of an analog signal is the inevitable decay of a system; so are all computer or digital data errors. Both systems are things we recognize as information because of certain symmetries within the system that relate them to previously categorized orders and symmetries. The breakdown of either system is ultimately the same process; the fact that computers are systems designed by computer engineers to work in particular ways in no way makes a computer system inherently different from a tidal pool. One crashes, the other dries up; in the meantime, they function as limited, but not closed, systems.
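The "files corrupt" point is concrete at the bit level. A hedged illustration (the label and indices are made up): flip a single bit in a byte string and the data is simply different. Digital decay isn't exempt from entropy; it just fails in discrete steps instead of a gradual hiss.

```python
def flip_bit(data: bytes, byte_index: int, bit_index: int) -> bytes:
    """Return a copy of `data` with exactly one bit inverted."""
    corrupted = bytearray(data)
    corrupted[byte_index] ^= 1 << bit_index  # XOR toggles that single bit
    return bytes(corrupted)

original = b"LGB 109"
damaged = flip_bit(original, 0, 1)  # flip bit 1 of the first byte

print(original)  # b'LGB 109'
print(damaged)   # b'NGB 109' -- 'L' (0x4C) became 'N' (0x4E)
```

One bit out of fifty-six and the label is wrong; scale that up to a video file and you get the freakish, irrevocable alterations described above.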
By the same token, people latched onto the notion that still and cinematic film has a mysterious quality that digital video will never reproduce. It does have a quality: grain. All the depth of color and subtlety of shadow is based on the fact that 35mm film has a finer grain than we can efficiently reproduce digitally, though HD technology is very close. Film grain is far from infinite; it is based on the grain of the substance that reacts to light within the film. It seems infinitely subtle to us because the grain in this film is actually finer than the grain inherent to the human eye. Yep. The resolution of our eyes is limited by the rod-and-cone system that translates the light waves from the outside world into brainspeak so we can create the internal image of light, dark, red, green, and blue. Digital resolution doesn't even need to catch up to film; it only needs to beat us.
The more terrifying misconception about digital representation is that, once it's captured, it is somehow unchanging, and infinitely duplicatable in its exact, originally digitized form, which is what has brought up all these odd dime-store philosophy moments in TV shows and movies about identity. See: The Matrix.
The question is: if the digital representation of data catches up to the universe's inherent digital level, will we lose our sense of identity in things? Will the physical, so-called analog sense of the world cease to have any meaning once we can create digital realities so detailed that they are beyond our senses' capacity to register the difference between them and reality?
So I was watching my computer digitize from the screen to the camera. I was seeing the same video on the camera screen that was on the computer screen, and I knew that exactly the same sequence of 1s and 0s would soon be on both. Put aside the natural occurrence of errors in transfer and recording, and assume that the two sequences are PRECISELY the same.
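That distinction, identical sequences on different media, has a neat analogue in code. A hedged sketch (the byte pattern and names are invented): two byte strings can compare equal bit for bit while remaining separate objects at separate addresses, which is roughly the tape-versus-hard-drive situation.

```python
# Two "recordings" of the same 1s and 0s, built independently,
# standing in for the tape copy and the hard-drive copy.
tape_copy = bytes([0b10110010] * 1024)
disk_copy = bytes(bytearray([0b10110010] * 1024))

print(tape_copy == disk_copy)  # True:  precisely the same sequence
print(tape_copy is disk_copy)  # False: two distinct entities
```

Equality of content and identity of the thing itself are different questions even inside one machine, let alone across a tape, a hard drive, and a broadcast signal.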
I noticed, not for the first time, that I could see the mouse pointer on the computer screen. I usually move the pointer, because I have a neurotic and totally unfounded fear that the movement of the pointer might show up on the camera screen, and thus the tape. But this time, on my way to the mouse, I looked at the camera screen, and noticed the timecode zipping by, next to the "time left" clock, and the "REC" letters in the corner. None of these show up on the finished video either, and I noticed, for the first time, that I was seeing two completely different things.
It then occurred to me that the two binary sequences were on completely different mediums, and were actually completely different entities unto themselves, and furthermore, that the sequence would never once for the rest of its existence be played out by exactly the same medium. Even if I were to put it out to another tape, it would be in and of itself something completely different. But even more abstractly, the pointer on my screen represented the nature of MY experience of the data through one source, while the timecode on the camera represented a different experience of what I had previously supposed to be the same thing. When I watch it tomorrow, on an analog tape through a TV monitor, it will be something else entirely.
So here are the three questions:
Is there a difference between seeing two separate digital representations of a thing?
Is there a difference between seeing a digital and an analog representation of the same source?
Is there a difference between seeing another damn boat in real life and seeing the same damn boat on a TV monitor?
The answer to all three is absolutely YES, but what's important is that all three of these are the same question when looked at from the paradigm of trying to understand the nature of experience. They are also similar to asking if there's a difference between seeing a picture at four o'clock in the afternoon and seeing the same picture at five o'clock. Yes. Of course. Because the nature of our experience is based on things that make all these experiences fundamentally different; they are only related.
So questions about identity and reality in the digital landscape are actually the same questions about understanding ourselves and the nature of experience. The same questions we've been asking for thousands of years. Furthermore, digital vs. analog is a myth; it is simply a popular version of the question of duality versus wholeness in a finite understanding of an infinite universe. The digital revolution carries no significant moral or philosophical burdens with it; it is about on par with the industrial revolution, and much less world-changing than the agricultural revolution. Digital technology's only major effect has been to make it easier for HUMANS to process and distribute information, so if you look at it like that, digital technology, far from being the downfall of existence, is actually only an innate expression of how we cope with our environment. The "unnaturalness" of technology is a coarse way to insult the "human-ness" of our nature.