With rental services like Netflix, you sometimes get a bum DVD that doesn't play as well as it should, especially if you're trying to watch it on a computer.
If some blocks are unreadable, playback gets really stuttery and slow. The thing is, if you could simply skip over the bad sectors, everything would probably work fine anyway: you might miss a line of dialog, but the movie as a whole is still watchable.
The first thing to do is to clean the disc off. If that doesn't work, I typically use a very fine automotive rubbing compound to smooth out any micro-imperfections.
But sometimes nothing works.
A few years back I wrote a C# program that did the dirty work of reading the DVD files and, when it couldn't read some portion, skipped it and filled in nulls. For the most part, DVD rippers seem to handle that kind of loss pretty well. Like I said, you might lose a few seconds, or maybe even a few minutes, but at least you can watch the movie.
```python
#!/usr/bin/env python3
import os
import sys

infilename = os.path.expanduser(sys.argv[1])
outfilename = os.path.expanduser(sys.argv[2])
chunksize = 128 * 1024  # 128 KB per read

print("%s -> %s" % (infilename, outfilename))

infile = open(infilename, 'rb')
outfile = open(outfilename, 'wb')

filesize = os.stat(infilename).st_size
currentloc = 0
goodblocks = 0
badblocks = 0

while currentloc < filesize:
    try:
        chunk = infile.read(chunksize)
        if not chunk:
            break
        print("\r%d" % currentloc, end='')
        sys.stdout.flush()
        outfile.write(chunk)
        currentloc += chunksize
        goodblocks += 1
    except IOError:
        print("\nError @ %d" % currentloc)
        sys.stdout.flush()
        # Pad with nulls so the output stays the same size as the source
        nbytes = min(chunksize, filesize - currentloc)
        outfile.write(b"\0" * nbytes)
        currentloc += chunksize
        infile.seek(currentloc)  # jump past the unreadable chunk
        badblocks += 1

infile.close()
outfile.close()

print("\nComplete! Total size: %d. Good blocks: %d. Bad blocks: %d."
      % (filesize, goodblocks, badblocks))
```
The theory of operation is really simple.
You pass in two arguments:
- The source file
- The destination file
It just reads blocks of chunksize bytes from the source and writes them to the destination. If there's a read error, it skips the entire chunk and writes nulls in its place.
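The core idea can be isolated in a small sketch; the `copy_with_skips` and `read_chunk` names, the fake data, and the simulated error are all mine, purely for illustration. The point is that padding unreadable chunks with nulls keeps every good byte at the same offset it had in the source:

```python
import io

CHUNK = 4  # tiny chunk size, just for the demo

def copy_with_skips(read_chunk, filesize):
    """Copy filesize bytes using read_chunk(offset, size); null-pad bad chunks."""
    out = io.BytesIO()
    loc = 0
    while loc < filesize:
        size = min(CHUNK, filesize - loc)
        try:
            out.write(read_chunk(loc, size))
        except IOError:
            out.write(b"\0" * size)  # skip the unreadable chunk
        loc += size
    return out.getvalue()

# Fake medium: 12 bytes, where the chunk at offset 4 is "unreadable"
data = b"ABCDEFGHIJKL"
def read_chunk(offset, size):
    if offset == 4:
        raise IOError("simulated read error")
    return data[offset:offset + size]

result = copy_with_skips(read_chunk, len(data))
# result == b"ABCD\x00\x00\x00\x00IJKL": same length, bad chunk nulled
```

Because the output is the same size as the input, anything downstream that relies on absolute offsets (which DVD container formats largely do) still lines up.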
Here's the question you're probably asking: "Why not retry sector by sector?"
Because it's way too freakin' slow! Each failed read takes a few seconds to time out, and errors tend to span vast segments of the disc (megabytes at a time).
Some things that could be done to improve it:
- Make the chunk size adaptive, to deal with small errors as well as big ones a bit more gracefully. Trade off a bit of time (but not too much) for more fidelity.
- Allow the output to go to standard out. That would let you pipe it to something like DeCSS to decrypt things on the fly if needed.
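The first idea might look something like this sketch (the function name, the fake `read_chunk`, and the size thresholds are all mine, not part of the original script): halve the chunk size on a failed read to zoom in on the damaged region, pad with nulls only when the minimum size still fails, and grow the chunk back after good reads so the copy stays fast:

```python
import io

def adaptive_copy(read_chunk, filesize, out, max_chunk=128 * 1024, min_chunk=2048):
    """Copy filesize bytes via read_chunk(offset, size), shrinking the chunk
    size around read errors and null-padding only truly unreadable spans."""
    loc = 0
    chunk = max_chunk
    while loc < filesize:
        size = min(chunk, filesize - loc)
        try:
            out.write(read_chunk(loc, size))
            loc += size
            chunk = min(chunk * 2, max_chunk)  # recover speed after a good read
        except IOError:
            if chunk > min_chunk:
                chunk //= 2  # zoom in on the bad region and retry
            else:
                out.write(b"\0" * size)  # unreadable even at minimum: skip it
                loc += size

# Fake medium: 16 bytes with a single "unreadable" byte at offset 5
data = b"ABCDEFGHIJKLMNOP"
BAD = 5
def read_chunk(offset, size):
    if offset <= BAD < offset + size:
        raise IOError("simulated read error")
    return data[offset:offset + size]

buf = io.BytesIO()
adaptive_copy(read_chunk, len(data), buf, max_chunk=8, min_chunk=1)
# buf.getvalue() == b"ABCDE\x00GHIJKLMNOP": only the one bad byte is lost
```

The second idea needs no new code at all: have the function write to standard output's binary stream instead of a named file, and the rest of the loop is unchanged.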