tar czpvf backup.tar.gz folder     # backup.tgz is acceptable as well
tar cjpvf backup.tar.bz2 folder    # backup.tbz2 works too
If you want a tar file you can later 'update', package it as a plain, uncompressed tar (the P flag additionally preserves absolute path names):
tar cpPvf backup.tar folder
Then, to update the archive, replace 'c' with 'u'; when unpacking, you can use 'k' to preserve files that already exist. A worked sequence follows below.
And for anyone coming from Windows XP, Vista, or 98: .dmg is a Mac OS file format (roughly what .exe is on Windows, to the best of my knowledge), while .tar.gz files are compressed archives, much like ZIP or WinRAR files.
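For example, here is a minimal sketch of that update workflow with GNU tar (the folder name is just a placeholder):
tar cpvf backup.tar folder     # create the uncompressed archive
tar upvf backup.tar folder     # later: append only files newer than the copies already in the archive
tar xpkvf backup.tar           # unpack, keeping any files that already exist on disk untouched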
This question has been asked before, but I'm not too sure of the answer.
What I have and why I want to do it.
I have a copy of FreeDB and I am looking to convert it into an SQL database. This is being done on a Windows platform, but I do have several Linux systems. I am going to write a program to read all of the files (several million of them), but I estimate that I will need about 80 GB of space to extract the 4.5 GB tar file because of the file count and sizes. This will give my poor Windows server a fit (and make a mess of my FAT table at the same time).
To get around this, I am looking to convert the file to an ISO (another linear file system), and mount the ISO as a drive.
If I am unable to do this, I will try to write some support for the TAR format and read the file directly. But that takes time, and I'll have to deal with hard linking... not something I look forward to.
I am fairly new to Linux, so an explanation of any commands and piping would also be welcomed with open arms.
Also, will ISO even handle that many files?
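For what it's worth, here is a rough, non-authoritative sketch of the ISO route on one of the Linux boxes, assuming the archive is called freedb.tar and that genisoimage and loop mounts are available (the file names and mount point are placeholders, not anything FreeDB-specific):
mkdir freedb-tree && tar xpf freedb.tar -C freedb-tree           # unpack once, on a filesystem with roughly 80 GB free
genisoimage -o freedb.iso -R -input-charset utf-8 freedb-tree    # build an ISO image with Rock Ridge extensions from the tree
sudo mkdir -p /mnt/freedb
sudo mount -o loop,ro freedb.iso /mnt/freedb                     # loop-mount the image read-only and browse it like a drive
Whether ISO 9660 copes gracefully with several million entries is worth testing on a subset first; as an alternative that skips the extract-and-repack step, a FUSE tool such as archivemount can mount the tar file itself read-only.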
Tar (short for "tape archiver") is a versatile tool that can archive files to disk or any other device as easily as to tape. In fact, if you don't work in a data center, you will probably never use tar with a tape drive.
Often, Unix/BSD/Linux files and source code are distributed in a zipped tar file, sometimes called a tarball. Extensions for tarballs are usually .tgz or .tar.gz (gz because it was compressed using gzip, the free GNU zip program). Rarely, you may run across a tar file that is not compressed and has an extension of simply .tar.
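As a rough illustration of how the pieces fit together, the z flag simply has tar run the data through gzip for you; the first two commands below should produce equivalent tarballs, and the last shows the reverse direction through a pipe:
tar czvf backup.tar.gz folder              # let tar invoke gzip itself
tar cvf - folder | gzip > backup.tar.gz    # or write an uncompressed tar stream to stdout and pipe it through gzip
gunzip -c backup.tar.gz | tar xvf -        # decompress on the fly and unpack the stream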
Here are some common uses for tar. If you pass a directory or a wildcard to tar, it will include all subdirectories in the tar file by default (a short demonstration of this follows the command list below).
Create a gzipped tar archive
tar czvf backup.tgz files-to-backup
Create a gzipped tar archive, preserving file permissions
tar czvpf backup.tgz files-to-backup
Extract a gzipped tar archive
tar xzvf backup.tgz files-to-backup
Create a bzipped tar archive (using bzip2 compression instead of gzip)
tar cjvf backup.tar.bz2 files-to-backup
Extract a bzipped tar archive
tar xjvf backup.tar.bz2 files-to-backup
List files in a tar archive without extracting
tar tf backup.tar
List files in a zipped tar archive without extracting
tar tzf backup.tgz
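And here is the short demonstration of the default recursion mentioned above, using a throwaway directory tree (the names are arbitrary):
mkdir -p project/src/util                     # build a small tree with a nested subdirectory
touch project/README project/src/util/helper.c
tar czvf project.tgz project                  # only the top-level directory is named on the command line
tar tzf project.tgz                           # the listing nevertheless shows every nested file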