permacomputing

Source repository for the main permacomputing wiki site
git clone http://git.permacomputing.net/repos/permacomputing.git # read-only access

commit 4c23f762731fe7d5ee6834b03514045b459afbd5
parent b68e56b597f94124959879f3b93af97273256a7c
Author: neau <neau@web>
Date:   Wed, 22 Jun 2022 16:18:47 +0200

empty web commit

Diffstat:
A digital_preservation.mdwn | 25 +++++++++++++++++++++++++
1 file changed, 25 insertions(+), 0 deletions(-)

diff --git a/digital_preservation.mdwn b/digital_preservation.mdwn
@@ -0,0 +1,25 @@
+## Techniques
+
+[[Bitstream Copying]]: more commonly known as "backing up your data"; the process of making an exact duplicate of a digital object.
+
+[[Migration]]: goes beyond simply refreshing data (copying it to a new medium of the same type, with no change whatsoever in the bitstream) by also converting it, to avoid obsolescence not only of the physical storage medium, but of the encoding and format of the data.
+
+[[Persistent Media]]: a medium such as a gold disc may reduce the need for refreshing and help diminish losses from media deterioration, as do careful handling, controlled temperature and humidity, and proper storage.
+
+[[Technology Preservation]]: preserving the technical environment that runs the system, including operating systems, original application software, media drives, and the like. It is sometimes called the "computer museum" solution.
+
+[[Digital Archaeology]]: methods and procedures for rescuing content from damaged media or from obsolete or damaged hardware and software environments. Digital archaeology is explicitly an emergency recovery strategy and usually involves specialized techniques to recover bitstreams from media that have been rendered unreadable, whether through physical damage or hardware failure such as head crashes or magnetic tape crinkling.
+
+[[Analog Backups]]: an analog copy of a digital object can, in some respects, preserve its content and protect it from obsolescence, while sacrificing digital qualities such as sharability and lossless transferability. Text and monochromatic still images are the most amenable to this kind of transfer.
+
+[[Replication]]: enhancing the longevity of digital documents while maintaining their authenticity and integrity through copying and the use of multiple storage locations.
+
+[[Normalization]]: a formalized implementation of reliance on standards. Within an archival repository, all digital objects of a particular type (e.g., color images, structured text) are converted into a single chosen file format thought to embody the best overall compromise among characteristics such as functionality, longevity, and preservability.
+
+[[Canonicalization]]: a technique for determining whether the essential characteristics of a document have remained intact through a conversion from one format to another.
+
+[[Emulation]]: combines software and hardware to reproduce, in all essential characteristics, the performance of another computer of a different design, allowing programs or media designed for a particular environment to operate in a different, usually newer, environment. Emulation requires the creation of emulators: programs that translate code and instructions from one computing environment so they can be properly executed in another.
+
+[[Encapsulation]]: grouping together a digital object and the metadata necessary to provide access to it. The grouping lessens the likelihood that any critical component needed to decode and render the object will be lost. Appropriate types of metadata to encapsulate with a digital object include reference, representation, provenance, fixity, and context information. Encapsulation is considered a key element of emulation.
+
+[[Universal Virtual Computer]]: a form of emulation. It requires the development of "a computer program independent of any existing hardware or software that could simulate the basic architecture of every computer since the beginning, including memory, a sequence of registers, and rules for how to move information among them. Users could create and save digital files using the application software of their choice, but all files would also be backed up in a way that could be read by the universal computer. To read the file in the future would require only a single emulation layer—between the universal virtual computer and the computer of that time."