Marcos Sueiro Bal is the Senior Archivist at New York Public Radio.
What keeps audio archivists up at night?
Wednesday, November 24, 2010 - 05:15 PM
Senior Archivist Marcos Sueiro Bal recently attended the annual conference of IASA, the International Association of Sound and Audiovisual Archives, in Philadelphia. He reports on some of the highlights (besides being enveloped by the sounds of the world’s largest pipe organ).
Brecht Declercq | VRT | BELGIUM Large Scale DAT-To-File Ingest and Annotation of Radio Programmes: The Path Chosen at Flemish Public Broadcaster VRT
Digital Audio Tape (commonly referred to as DAT or R-DAT) has many known problems. One of its manufacturers warned back in 1994: “Ampex's position about archiving valuable source programming to R-DAT is simple. We do not recommend it.” Like many other broadcast stations, VRT has lots of DATs: in their case, an estimated 70,000 hours of DAT material, 10,000 of which have been transferred to digital WAV files along with their metadata.
One way to transfer DAT audio is to use DDS drives and specialized software such as DATXtract. VRT instead decided to transfer in real time, with intensive annotation, while listening for errors. Archivists later listen to the WAV files and check again for technical glitches while adding subject headings and other metadata.
Thankfully, Brecht reported only two complete DAT failures so far (out of 10,000), and many of the more common glitches occurred during non-critical sections (e.g. during commercial music playback). Such a low failure rate may be due to the tapes having been originally recorded on high-quality professional decks, and to their having enjoyed ideal storage conditions.
He did not show specific measurements of sound degradation (I later encouraged him to publish his results), and found no correlation between brand and deterioration in the VRT archives. Interestingly, errors seemed to spike at one point in the tapes' chronology. When he asked a VRT engineer about it, the engineer noted that the technical staff had been cut that year, so the tape decks were maintained less frequently; the errors were therefore presumably recording errors, not errors due to degradation.
He also mentioned that there may be a correlation indicating that pre-1990 tapes are faring worse, but presented no specific evidence.
He mentioned a RAI study showing that richer metadata significantly increases usage (I have asked him and the EBU for the citation).
Careful keeping of metadata (particularly quantifying read errors) will be important for current and future prioritization studies.
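As a rough illustration of what "quantifying read errors" might look like in practice, here is a minimal Python sketch of a per-tape transfer record; all field and class names are hypothetical, not VRT's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class ErrorEvent:
    offset_s: float   # position of the glitch within the programme, in seconds
    kind: str         # e.g. "dropout", "mute", "interpolation"
    critical: bool    # did it hit programme content, or only non-critical material?

@dataclass
class TapeTransferRecord:
    tape_id: str
    duration_s: float
    errors: list = field(default_factory=list)

    def log_error(self, offset_s, kind, critical=False):
        self.errors.append(ErrorEvent(offset_s, kind, critical))

    def errors_per_hour(self):
        # A simple rate that can be compared across the collection
        # when deciding which tapes to prioritize for re-transfer.
        hours = self.duration_s / 3600
        return len(self.errors) / hours if hours else 0.0

rec = TapeTransferRecord("DAT-1993-0421", duration_s=5400)
rec.log_error(1200.5, "dropout", critical=False)
rec.log_error(3100.0, "mute", critical=True)
print(round(rec.errors_per_hour(), 2))  # → 1.33 (2 errors over 1.5 hours)
```

Even a record this simple, kept consistently, would support the kind of brand- and vintage-correlation questions raised in the talk.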
Shane Beers & Bria Parker | University of Michigan | UNITED STATES HathiTrust and the Challenge of Digital Audio
HathiTrust is an interesting repository hosted by the University of Michigan, but open to all institutions. For books (which comprise the vast majority of the 264-terabyte collection), it appears to use the same ingest packages (“AIPs”) used by Google Books. They recently decided to run a pilot for audio and found a lack of unified standards. For technical metadata they are following the upcoming AES schema and Dublin Core, and for preservation metadata they use PREMIS within METS wrappers (METS creation is automated). Their AIP follows Indiana University’s Sound Directions best-practice recommendations. Format validation is done through JHOVE.
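JHOVE is normally driven from the command line (something like `jhove -m WAVE-hul -h XML -o report.xml master_0001.wav`) and emits an XML report. As a small illustration of how such a report might be checked in an ingest pipeline, here is a Python sketch that pulls the validation status out of JHOVE-style XML; the sample below is abridged and illustrative, not a complete JHOVE output:

```python
import xml.etree.ElementTree as ET

# Abridged, illustrative JHOVE XML report for one WAV file; real output
# carries many more sections (profiles, technical properties, checksums).
SAMPLE = """<jhove xmlns="http://hul.harvard.edu/ois/xml/ns/jhove">
  <repInfo uri="master_0001.wav">
    <format>WAVE</format>
    <status>Well-Formed and valid</status>
  </repInfo>
</jhove>"""

def validation_status(jhove_xml):
    """Return {uri: status} from a JHOVE XML report, ignoring namespaces."""
    root = ET.fromstring(jhove_xml)
    result = {}
    for elem in root.iter():
        if elem.tag.endswith("repInfo"):
            uri = elem.get("uri")
            for child in elem:
                if child.tag.endswith("status"):
                    result[uri] = child.text
    return result

print(validation_status(SAMPLE))
# {'master_0001.wav': 'Well-Formed and valid'}
```

A batch process could flag for human review any file whose status is not "Well-Formed and valid" before the AIP is sealed.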
Stefano S. Cavaglieri & Gabriele Franzoso | Fonoteca Nazionale Svizzera | SWITZERLAND Raising the Quality Bar in Re-Recording
Mr. Cavaglieri reported measured differences in playback quality (speed accuracy, crosstalk, phase correlation, and frequency response) between a common Technics 1200 turntable fitted with a Shure V-15 Mk IV and a more expensive setup, the SME Model 10 with an Ortofon Bronze cartridge. The results were not surprising, and easily perceived. A similar comparison was made between standard factory Studer A807s and machines with modifications commonly mentioned on the internet (mostly related to output op-amps). Again, differences were detected; but, interestingly, the numbers seemed to show that the factory configuration performed best.
Four A/D converters were also analyzed: the Apogee Rosetta; an entry-level converter by NOA (no longer manufactured, and probably since replaced by a newer NOA model); the RME Fireface; and the Weiss ADC. Mr. Cavaglieri had little time to show relevant results. Audio analysis was performed using RightMark software.
Feigning despair, Mr. Cavaglieri ended with a quote from architect Mario Botta about preservation, in which Botta claims that the only form of preservation is to let the building fall (!).
Nadja Wallaszkovits & Dr. Peter Liepert | Phonogrammarchiv | AUSTRIA Digitisation of Highly Degraded Acetate Tapes – A Treatment Report
The Phonogrammarchiv in Vienna is no stranger to deteriorating acetate tapes, and some recent projects in that field yielded unexpected results; for example, brittle acetate tapes seemed to play back better in a relatively high-temperature, high-humidity environment. However, the archive was not quite ready for the shocking state of decay exhibited by a set of recently received tapes from an (undisclosed) country with a tropical climate. One tape had shrunk to such a degree that it looked like a disc inside the reel.
Chemical analysis revealed that these tapes were Agfa triacetate, probably manufactured in the late 1940s, but recorded in the 1970s.
Ongoing research at the Phonogrammarchiv suggests that decaying acetate tapes may actually suffer from two parallel processes:
- The well-known acetate breakdown known as vinegar syndrome, which is accelerated by heat and moisture;
- An evaporation of plasticizers, which reduces the tape’s flexibility and causes it to eventually become brittle.
Interestingly, these highly degraded tapes seemed to no longer suffer from vinegar syndrome, simply because the second process was so advanced that most of the acetate was gone; the tapes were what Ms. Wallaszkovits called “fully cured”.
If the decay process is truly two-pronged, this could have repercussions for the storage of all cellulose acetate materials, including most lacquer discs. According to Ms. Wallaszkovits, water can act as a plasticizer and compensate for the plasticizer loss; she pointed out that acetate discs kept in too dry an environment may start cracking earlier.
The Phonogrammarchiv is developing a method to physically restore severely damaged acetate tapes. Although the formula is still under development and thus not yet public, it will be made available once the testing period has been successfully completed. The results are extremely encouraging, judging by the photographs and audio files presented by Ms. Wallaszkovits, although she pointed out that part of the reason for the high quality of the audio is that it was recorded relatively recently, on 1970s equipment.
In unrelated informal conversation afterwards, Ms. Wallaszkovits mentioned that the Phonogrammarchiv is capturing bias information from tapes by digitizing at 192 kHz and splitting the signal. This information can be used in later digital restoration processes such as those of Plangent Processes, where easily detectable fluctuations in the bias signal are used to restore the audio. The Phonogrammarchiv is reportedly developing (or has developed) a similar process.
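The details of the Phonogrammarchiv's signal chain are not public, so the following is only a toy NumPy sketch of the band-splitting idea: capture at 192 kHz, then separate the programme audio from the higher-frequency bias remnant with a brick-wall FFT filter. The 40 kHz split point and the 76 kHz "bias" tone are illustrative assumptions, not their actual values:

```python
import numpy as np

FS = 192_000      # capture sample rate mentioned in the talk
CUTOFF = 40_000   # illustrative split point between audio band and bias remnant

def split_bands(signal, fs=FS, cutoff=CUTOFF):
    """Split a capture into (audio_band, bias_band) with a brick-wall FFT filter."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    low = spectrum.copy()
    low[freqs >= cutoff] = 0    # keep only the programme audio
    high = spectrum.copy()
    high[freqs < cutoff] = 0    # keep only the bias remnant
    return np.fft.irfft(low, len(signal)), np.fft.irfft(high, len(signal))

# Synthetic check: a 1 kHz "audio" tone plus a faint 76 kHz "bias" tone.
t = np.arange(FS) / FS
capture = np.sin(2 * np.pi * 1_000 * t) + 0.1 * np.sin(2 * np.pi * 76_000 * t)
audio, bias = split_bands(capture)
```

The recovered bias band could then serve as the timing reference for speed-fluctuation correction of the kind Plangent Processes performs.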
Ms. Wallaszkovits also mentioned that they adjust azimuth by tapping a frequency-selective voltmeter to the bias current output. When that voltmeter shows a maximum reading, you have the optimal azimuth. (She also mentioned that in their tests, playback equalization is best applied in the analog domain.)
New Attraction: PBCore 2.0
Chair: Karen Cariani, WGBH Educational Foundation
Speakers: Chris Beer, WGBH Interactive's Open Vault; Courtney Michael, WGBH Educational Foundation; Jack Brighton, University of Illinois; Kara Van Malssen, PBS American Archive Project; Katrina Dixon, Northeast Historical Film
The Corporation for Public Broadcasting’s metadata schema, PBCore, is being revised. PBCore 2.0 should be released in January 2011, with significant changes and improvements. The team is currently working on its SIP (Submission Information Package) for the American Archive project, and is still open to suggestions. More web resources like http://www.americanarchiveinventory.org, which includes PBCore import/export, will be available in the near future; others include a cataloging-help site, http://www.pbcoreresources.org, and a PBCore validator, http://pbcorevalidator.org/. Another good resource is http://www.opensourcearchiving.org, a blog on “open source solutions for audiovisual archives”.
Tech MD: Is There a Doctor in the House?
Chair: Dave Rice, AudioVisual Preservation Solutions
Speakers: Kate Murray, FADGI; Hannah Frost, Stanford University; David Rice, AudioVisual Preservation Solutions
Technical metadata will be essential for the long-term preservation of digital files. FADGI (the Federal Agencies Digitization Guidelines Initiative) has suggested a set of BWF guidelines for the so-called bext chunk of archival Broadcast Wave Files (BWF), while Stanford University is working on a successor to JHOVE (JSTOR/Harvard Object Validation Environment), JHOVE2. This is a project from the Library of Congress’ NDIIPP, carried out by the California Digital Library, Portico, and Stanford University, and it makes extensive use of DROID (Digital Record Object Identification) and the UK National Archives’ PRONOM registry. JHOVE2 could be a very useful tool indeed. Another one could be Dave Rice’s metadata aggregator tool, “lovingly called” FATMAP, which can “crawl” a digital collection and mine technical data to present in meaningful ways. As a test, the program crawled the video holdings of the Internet Archive, and the results presented a fascinating analysis of the last 20 years of submissions to that site, plotting the rise and fall in popularity of file formats, languages, places of submission, and even aspect ratios.
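FATMAP itself was only demonstrated, so as a rough illustration of the crawl-and-aggregate idea, here is a minimal Python sketch that walks a collection and tallies files by extension. The extension proxy and the function name are mine, not FATMAP's; a real tool would extract much richer technical metadata from each file:

```python
from collections import Counter
from pathlib import Path
import tempfile

def format_census(root):
    """Walk a collection and tally files by extension,
    a crude stand-in for real technical-metadata extraction."""
    return Counter(p.suffix.lower() or "(none)"
                   for p in Path(root).rglob("*") if p.is_file())

# Example: build a tiny fake collection and crawl it.
root = tempfile.mkdtemp()
for name in ["a.wav", "b.WAV", "c.mp3", "notes"]:
    Path(root, name).touch()
census = format_census(root)
print(census[".wav"], census[".mp3"], census["(none)"])  # → 2 1 1
```

Snapshotting such a census at intervals is what lets a tool plot format popularity over time, as in the Internet Archive demonstration.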
Mike Casey | Indiana University | UNITED STATES Strategic Evaluation of Media Collections: The Indiana University Bloomington Media Preservation Survey
The issues faced by Indiana University are shared by audiovisual archives around the world, and are aptly summarized in their media preservation survey. IU is now getting ready to act on the discouraging results: at their current rate, for example, it would take them 120 years to complete reformatting. Indiana University is therefore considering “massive, rapid, and considered” reformatting of its media collections. They are hiring a campus-wide Media Preservation Specialist, and have already hired a Film Archivist. IU is now focusing on prioritization guidelines (even evaluating Columbia University’s AVDb), while utilizing parallel ingestion and creating a repository infrastructure with the help of AVPS.
Mr Charles A. Richardson | Richardsons Magnetic Tape Restoration LLC | UNITED STATES Rethinking Triage and Preservation of Analog Media Collections
Not without controversy, Mr. Richardson believes that the carbon backcoating of some tapes (not the binder, as is commonly thought) is to blame for sticky-shed syndrome (this appears to be at least statistically reasonable). In his view, the common method for dealing with sticky shed (incubating, a.k.a. baking, the tape) contradicts the archival principle of “do no harm”, since tapes do not appear to tolerate repeated bakings well. Sticky shed is, after all, mostly a friction problem, so much of Mr. Richardson’s talk focused on tribology, the science of friction. A potentially useful, if perhaps not very accurate, method to measure friction on tapes involves reading the current draw on a portable Nagra recorder's meter: presumably, the more current the recorder draws, the more friction the tape presents.
Jim Wheeler of Wheeler Tape Forensics, the original developer of the Ampex patent for incubating tapes with sticky-shed syndrome (which, incidentally, technically voided Ampex’s tape warranty because of the high temperatures involved), and Jim Lindner of Media Matters both pointed out that Mr. Richardson needs to present his studies in a peer-reviewed journal, and to give details on his patented system, which supposedly “scrapes” the carbon black from the oxide side and which may itself not conform to the “do no harm” principle.
Learn more about the conference here.