We hear a lot about “high-res audio” these days. Sound digitized at 192,000 samples per second must be a lot better than the usual 44,100, right? Well, maybe not.
We can hear sounds only in a certain frequency range. The popular rule of thumb is 20 to 20,000 Hertz, though there’s a lot of variation from person to person. Very few people can hear anything above 20,000 Hz.
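By the Nyquist–Shannon sampling theorem, a given sampling rate can represent frequencies only up to half that rate. A quick sketch of the arithmetic (plain Python, my own illustration, not from any audio library):

```python
def nyquist_limit(sample_rate_hz: float) -> float:
    """Highest frequency a given sampling rate can represent (half the rate)."""
    return sample_rate_hz / 2

# CD-quality audio already reaches past the ~20,000 Hz ceiling of human hearing.
print(nyquist_limit(44_100))   # 22050.0
print(nyquist_limit(192_000))  # 96000.0 -- far beyond anything we can hear
```

So the standard rate already covers the audible band; the extra samples mostly capture frequencies nobody can hear.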
Posted in commentary
The 3-D printing industry has been moving toward 3MF as a standard file format. It’s an XML-based format that claims to offer extensibility, interoperability, and freedom from the problems of other formats. The specification includes an XSD schema. I’m no judge of how suitable it is for 3D modeling, but yes, it is extensible. In fact, it’s designed with a relatively lean core model, so additional features can be added as extensions.
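To give a feel for that lean core model, here is a hand-written fragment with a single triangle, parsed with Python’s standard library. The element names and namespace follow the published 3MF core specification, but the document itself is my own sketch and hasn’t been validated against the official XSD:

```python
import xml.etree.ElementTree as ET

# Minimal 3MF-style model: one mesh object with three vertices and one triangle.
# Namespace URI is from the 3MF core spec; the rest is an illustrative sketch.
DOC = """<?xml version="1.0" encoding="UTF-8"?>
<model unit="millimeter"
       xmlns="http://schemas.microsoft.com/3dmanufacturing/core/2015/02">
  <resources>
    <object id="1" type="model">
      <mesh>
        <vertices>
          <vertex x="0" y="0" z="0"/>
          <vertex x="10" y="0" z="0"/>
          <vertex x="0" y="10" z="0"/>
        </vertices>
        <triangles>
          <triangle v1="0" v2="1" v3="2"/>
        </triangles>
      </mesh>
    </object>
  </resources>
  <build>
    <item objectid="1"/>
  </build>
</model>"""

NS = {"m": "http://schemas.microsoft.com/3dmanufacturing/core/2015/02"}
root = ET.fromstring(DOC)
vertices = root.findall(".//m:vertex", NS)
triangles = root.findall(".//m:triangle", NS)
print(len(vertices), len(triangles))  # 3 1
```

In an actual .3mf file, XML like this is packaged inside a ZIP container along with content-type and relationship files, which is where the format gets its interoperability story.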
A recent Fortune article, “Why These Big Companies Want a New 3D File Format”, discusses 3MF from a business standpoint.
The old STL format, based on tessellation, is widely used, but it’s been criticized for generating huge files and lacking features.
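The file-size complaint is easy to quantify: binary STL stores each triangle independently as twelve 32-bit floats (a normal plus three vertices) and a 16-bit attribute word, so vertices shared between facets are written over and over. A back-of-the-envelope estimate in Python (my own sketch, based on the well-known binary STL layout):

```python
def stl_binary_size(num_triangles: int) -> int:
    """Approximate size in bytes of a binary STL file."""
    HEADER = 80              # fixed-length header
    COUNT = 4                # uint32 triangle count
    PER_FACET = 12 * 4 + 2   # 12 floats (normal + 3 vertices) + attribute word
    return HEADER + COUNT + num_triangles * PER_FACET

# A million-triangle scan runs about 50 MB, with every shared vertex duplicated.
print(stl_binary_size(1_000_000))  # 50000084
```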
Posted in News
Tagged 3D, printing, XML
A change after a version number’s decimal point is usually minor or moderate, but the creators of the EPUB 3.1 draft at IDPF call their changes from 3.01 “radical.”
Posted in News
Tagged html5, ePub, IDPF
Today I came across a video from the Library of Congress on “Why digital preservation is important for you.” Anyone following its advice will certainly have a better chance of keeping their files alive and organized for a long time. The only question is: Who’s going to follow that advice?
Recently I came across the term “fuzzing” for intentionally damaging files to test the software that reads them. Most of the material I’ve found doesn’t provide a useful introduction; it assumes that if you know the term, you already understand something about it. One good article is “Fuzzing — Mutation vs. Generation” on the Infosec website. According to that article, fuzzing properly denotes testing the software’s response to file changes rather than the changes themselves, but I’m seeing the term used mostly in the latter sense.
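A mutation fuzzer in its simplest form just flips a few bits in a valid input and feeds the result to the reader under test, watching for crashes. A minimal sketch (plain Python, my own illustration; the byte string stands in for a real, valid sample file):

```python
import random

def mutate(data: bytes, flips: int = 7, seed: int = 0) -> bytes:
    """Return a copy of `data` with a few randomly chosen bits flipped.

    An odd flip count guarantees at least one bit actually differs,
    since cancellation would require an even number of flips.
    """
    rng = random.Random(seed)   # seeded, so a crash can be reproduced
    buf = bytearray(data)
    for _ in range(flips):
        pos = rng.randrange(len(buf))
        buf[pos] ^= 1 << rng.randrange(8)
    return bytes(buf)

original = b"RIFF....WAVEfmt "   # stand-in for a real, valid sample file
damaged = mutate(original)
print(len(damaged) == len(original), damaged != original)  # True True
```

In a real test harness you would loop over many seeds, pass each mutated buffer to the parser inside a try/except (or a subprocess), and log any seed that produces a crash or hang.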
As you’ve doubtless noticed if you follow this blog or my Twitter feed, I’ve made two video courses and put them up on Udemy.com. You may be wondering why I’m doing this, especially if you know how much I hate being on camera.
Several steps have led to my being here. One is that the more gray hair you have, the more likely clients and employers are to assume the gray matter has leaked out of your brain, even though that’s nonsense. So I have to find other sources of income. I’ve been writing, including the book Files that Last, and having some success there. Many people, though, like video learning, and turning written material into video presentations isn’t a huge step. I liked the arrangements Udemy offered, so I’ve given it a try.
Posted in Personal
Tagged Udemy, video
The Sächsische Landesbibliothek – Staats- und Universitätsbibliothek Dresden (Saxon State and University Library Dresden), which somehow gets abbreviated to SLUB, has developed a tool for working with TIFF files in digital preservation. fixit_tiff is a command-line utility, written in C, which can do some repairs on defective TIFF files. The focus appears to be on correcting common errors, not on repairing corrupted files. A blog post from July (in German) indicates it can do configurable validation using a simple query language.
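fixit_tiff itself is written in C, but the kind of header sanity check such a tool starts from is easy to show. Here’s a Python sketch of the first check on a TIFF file; the 8-byte header layout (byte-order mark, magic number 42, offset to the first IFD) is from the TIFF 6.0 specification, but the code is my own illustration, not anything from fixit_tiff:

```python
import struct

def tiff_header_ok(data: bytes) -> bool:
    """Check the 8-byte TIFF header: byte order, magic 42, plausible IFD offset."""
    if len(data) < 8:
        return False
    if data[:2] == b"II":        # little-endian ("Intel" order)
        endian = "<"
    elif data[:2] == b"MM":      # big-endian ("Motorola" order)
        endian = ">"
    else:
        return False
    magic, ifd_offset = struct.unpack(endian + "HI", data[2:8])
    return magic == 42 and ifd_offset >= 8  # first IFD can't overlap the header

print(tiff_header_ok(b"II*\x00\x08\x00\x00\x00"))  # True
print(tiff_header_ok(b"GIF89a\x00\x00"))           # False
```

Real validation, of course, goes much further: walking the IFD entries, checking tag types and counts, and verifying that offsets stay inside the file.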
It’s available under the same license as Libtiff. Just what is that license? The only thing I can find is a very outdated “Use and Copyright” statement, which is on a page so old it warns about patents on LZW compression. It’s available for free, anyway.
The British Library’s Digital Preservation Team has issued a report on WAV Format Preservation Assessment. It cites the broad adoption of WAV and its extension BWF (Broadcast Wave Format) as a positive for preservation purposes and offers only a few cautions. I’m flattered by the recommendation, “Wherever possible and appropriate to the workflow, submitted content should be validated using JHOVE.”
The nineties saw huge changes in personal computing, as operating systems became more complex, Internet connections became common, and the World Wide Web appeared. This meant a lot of instability as formats came and went.
This past weekend I discovered a CD-ROM in my closet with the production files for a small-run songbook, The Pegasus Winners (optimistically called “Volume 1”), that I produced in 1994. The good news is that the CD is still readable. The bad news is that I can’t read most of the files. The not-so-bad news is that I could probably recover them with moderate effort.
NASA is using a format called MRF (Meta Raster Format) for online imagery; it’s claimed to deliver images from cloud services ten times as fast as JPEG 2000 when used with a compression algorithm called LERC. LERC is patented by Esri, which says the technique is especially suited for geospatial applications and makes the algorithm “freely available to the geospatial and earth sciences community.” An implementation of MRF from NASA is available on GitHub under the Apache license, and an implementation of LERC is on GitHub from Esri.
Posted in News
Tagged images, software