December 13, 2017
 
Several messages concerning Katz's new Pakker.
File NEWPKPAK.ZIP from The Programmer’s Corner in
Category Tutorials + Patches
File Name     File Size  Zip Size  Zip Type
NEWPKPAK.TXT  18875      7155      deflated


Contents of the NEWPKPAK.TXT file


A group effort is *well* underway to develop a new archiving standard which
will have many new features and will be more efficient than present methods.

The following is an edited capture file from a session conducted on the
Exec-PC Business Board 414/964-5160 on Saturday September 17, 1988 by
Keith Petersen, W8SDZ.

conf: FILE COMPRESSION FORUM #1544 08-18-88 05:29 (Read 102 times)
from: DEAN COOPER
to: GRANT ELLSWORTH (Rcvd)

[speaking about a file recently uploaded which urges everyone to convert to
the DWC archiver]... The author of the note included in the file was quite
misinformed as he thought Phil would no longer be able to develop ANY
archivers after January 89. But of course, Phil is allowed and is going to
come out with a NEW archiver, it just must not use the same file format as
ARC does.
Also, the author of the note basically wanted people to switch over to
DWC, and I have heard other people say the same thing. But I would caution
people to just be a bit patient and wait for Phil's new archiver. This is
an easy thing for me to say since both Phil and I are combining our efforts
on this new archiver... So just stay tuned...
Dean

conf: FILE COMPRESSION FORUM #1550 08-19-88 21:55 (Read 103 times)
from: GRANT ELLSWORTH
to: DEAN COOPER (Rcvd)

Dean, with you and PK teamed ... I can't think of a better possibility or
probability for a superior compressor/archiver utility .... I WILL stay
tuned! Regards, Grant

conf: FILE COMPRESSION FORUM #1552 08-20-88 20:28 (Read 99 times)
from: PAUL ZIMMERMAN
to: DEAN COOPER (Rcvd)

Is this going to be a "new standard"??? Either your and Phil's work will be
merged into a single entity or the two programs will understand each
other's formats and not throw tantrums? Sounds very good. It will be hard
to be patient not knowing what is going on....
Paul

conf: FILE COMPRESSION FORUM #1553 08-21-88 00:13 (Read 101 times)
from: PHIL KATZ
to: PAUL ZIMMERMAN (Rcvd)

Paul,

Yes, it will be a completely new standard. With Dean's experience
with DWC and my experience with (name deleted to protect the innocent)
combined, we will be able to come out with something much better
than the current software, both in terms of compression performance
and functionality.

>Phil>

conf: FILE COMPRESSION FORUM #1555 08-21-88 19:45 (Read 96 times)
from: PAUL ZIMMERMAN
to: PHIL KATZ (Rcvd)
cc: DEAN COOPER

One uncertainty remains. Who will write a "converter" from ARC format to
your new one? Dean? Or someone "anonymous"??? (Probably one of those
Macrobiotic Programmers I heard about in Bull Roar? :-) )
Paul

conf: FILE COMPRESSION FORUM #1556 08-21-88 22:32 (Read 98 times)
from: PHIL KATZ
to: PAUL ZIMMERMAN (Rcvd)

Paul,

Well, who writes a converter isn't a major issue. Basically,
all that will be necessary to convert to the new format is
a program to alternately invoke PKUNPAK to extract the files
and then the new software to recompress them into the new
format. Since the new software will be using different and
better compression algorithms, it will be necessary to do
this.

Anyway, a person in New York, acting on his own cognizance, has
volunteered to write a conversion program.

>Phil>
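Phil's conversion recipe above is just a two-step pipeline: unpack with PKUNPAK, then repack with the new software. A minimal sketch in modern Python for illustration only; NEWPAK is a placeholder name for the then-unreleased packer (the real tools were MS-DOS executables, not Python):

```python
from pathlib import Path

def conversion_commands(archive: str, workdir: str = "TEMP") -> list[list[str]]:
    """Build the two command lines Phil describes: extract the old-format
    archive with PKUNPAK, then recompress everything with the new packer
    (NEWPAK is a hypothetical name)."""
    stem = Path(archive).stem
    return [
        ["PKUNPAK", archive, workdir + "\\"],                 # step 1: extract
        ["NEWPAK", "-a", stem + ".NEW", workdir + "\\*.*"],   # step 2: repack
    ]

# A real converter would run each command in order, e.g. with subprocess.run(cmd),
# looping over every *.ARC file on the disk.
```

For example, `conversion_commands("OLDFILE.ARC")` yields the PKUNPAK extraction step followed by the repacking step, which is exactly the "for"-loop batch file Paul suggests in the next message.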

conf: FILE COMPRESSION FORUM #1557 08-21-88 22:43 (Read 96 times)
from: PAUL ZIMMERMAN
to: PHIL KATZ (Rcvd)

Yes, I suppose a batch file with some clever application of the
"for" command could do the trick. But I was wondering if something
more exotic might pop up.

I am VERY curious about the "new and better" methods. Any hints? Going
to full 16 bit tables, or what??? (I have heard that this requires a
lot of RAM, but who nowadays doesn't have at least 512k???) And will it be
possible to tell the new compressor to "optimize" by examining a file
at full length and then building the best possible compressed file?

Paul

conf: FILE COMPRESSION FORUM #1558 08-21-88 23:39 (Read 98 times)
from: PHIL KATZ
to: PAUL ZIMMERMAN (Rcvd)

Paul,

Well, I don't want to spill all my beans, but I have a prototype
compression algorithm that uses *less* memory than PKPAK, but
consistently compresses better. I am also evaluating an algorithm
that can compress much, much better than the current methods,
but takes a long time to compress. The extraction isn't too
bad though. This might be included as an option if you want
to maximize the compression achieved.

I think something like a 16-bit code is pretty much out of the
question. Even though the current software will run in 128K,
there are still applications where there isn't that much
available. That is the main reason for the junior versions
of PKPAK and PKUNPAK currently. Especially when compression
is integrated into other applications, memory usage is a
major concern. One of the goals for the new software is to
be able to run in about the same amount of memory (or less)
than the current software, at least in the MS-DOS versions.

>Phil>

conf: FILE COMPRESSION FORUM #1559 08-22-88 05:53 (Read 99 times)
from: DEAN COOPER
to: PAUL ZIMMERMAN (Rcvd)

The new archiver will be a NEW format and standard. We are trying to
put in all the things that were too difficult with our older formats, so if
you have any ideas for features or things you've always wanted in an
archiver, speak up now, and if we haven't thought of it already, we may be
able to work it in since we're starting from scratch in an attempt to do
things right...
Dean

conf: FILE COMPRESSION FORUM #1589 08-30-88 05:59 (Read 89 times)
from: DEAN COOPER
to: PATRICK LEMIRANDE (Rcvd)

The program will be designed to be as modular as possible, so that we will end
up with a library of routines that do compression/decompression, a library
of routines that can manipulate an archive file (all the basic things that
the user can do from the normal command line program), and then a front end
that puts a command line interface on to the top. Since many people,
including myself, like a command line program, and since it will be no big
deal to make one seeing that it is just a small part on top of the library
that does all the work, then we have a command line version of the program.
But of course, other people like the full-screen interface, so that will
be done too...
Dean
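Dean's layering (compression routines at the bottom, archive-manipulation routines above them, and a thin command-line front end on top) can be sketched as follows. This is a hypothetical illustration in modern Python, with zlib standing in for the compression layer; the actual project was written in C with its own algorithms:

```python
import sys
import zlib

# Layer 1: compression/decompression routines.
def compress(data: bytes) -> bytes:
    return zlib.compress(data)

def decompress(data: bytes) -> bytes:
    return zlib.decompress(data)

# Layer 2: archive manipulation, built only on layer 1.
class Archive:
    def __init__(self):
        self.members = {}                 # member name -> compressed bytes

    def add(self, name: str, data: bytes) -> None:
        self.members[name] = compress(data)

    def extract(self, name: str) -> bytes:
        return decompress(self.members[name])

    def list_names(self):
        return sorted(self.members)

# Layer 3: a thin command-line front end, "just a small part on top
# of the library that does all the work."
def main(argv):
    arc = Archive()
    if argv[:1] == ["demo"]:
        arc.add("README.TXT", b"hello")
        print(arc.list_names())

if __name__ == "__main__":
    main(sys.argv[1:])
```

The point of the layering is exactly what Dean says: once layers 1 and 2 exist as a library, a command-line version and a full-screen version are both cheap front ends over the same code.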

conf: FILE COMPRESSION FORUM #1601 08-30-88 23:55 (Read 95 times)
from: PHIL KATZ
to: PATRICK LEMIRANDE (Rcvd)
cc: DOUGLAS HAY

Patrick,

Douglas Hay is working with us to develop a menu driven full screen
front end for the new software. This will be something integrated
into the design, and will have self-contained compression/extraction
routines so it won't need to shell to other programs. Of course,
there will also be command line driven versions of the software
available for use in automated procedures and batch files etc.

>Phil>

conf: FILE COMPRESSION FORUM #1576 08-29-88 00:59 (Read 89 times)
from: PHILIP BURNS
to: PHIL KATZ (Rcvd)
subject: NEW COMPRESSION PROGRAM

cc: DEAN COOPER

I see from messages here and elsewhere that you guys are working
on a new file compression/librarying program to replace the PKPAK
and PKUNPAK programs.

Many of us are looking for a replacement for ARC, partly because of
its MS-DOS based limitations (short file names, no directory
information, no indication of file type, etc.), and partly because
of the current insistence of SEA that the ARC file type is now
proprietary and ANY program which processes an ARC file in any form
requires a license from SEA. This license condition is completely
unacceptable.

My questions on your new work are these:

(1) Will you be looking at issues of using the programs
on systems besides MS DOS and OS2? For example,

I'd like to be able to use the same/similar programs
to work on the same file across a variety of systems,
like Unix, the Macintosh systems, VAX/VMS, IBM CMS
and MVS, CDC NOS/VE, etc.

(2) Will you be making the COMPLETE file specification
public domain, or copyrighted but completely free
of licensing restrictions? By that I mean, do you
intend to allow by default, anyone to process a
file in your new format without licensing restrictions?
(If not, I certainly wouldn't use your new programs,
as this would lead to the same silliness as currently
exists regarding ARC).

Thanks.

-- Pib
---------------

conf: FILE COMPRESSION FORUM #1577 08-29-88 05:34 (Read 87 times)
from: DEAN COOPER
to: PHILIP BURNS (Rcvd)

How long do file names need to be?? And in what way would you like to
have a long file name truncated down to DOS size? Also, what file types
are there? On these systems that have different file types, can one use C
functions to create the various types? If so, what are the typical arguments
needed? Do different file types need to be written out differently?
Dean
P.S. The new archiver's format will be made public.

conf: FILE COMPRESSION FORUM #1579 08-29-88 07:55 (Read 88 times)
from: MIKE SHAWALUK
to: PHILIP BURNS (Rcvd)

Philip,
I am the co-developer for the VAX/VMS version of "our" new archiving
utility (no bruised ego here, but I just wanted to say that it's not just
Dean and Phil who are involved in the "project"). Anyways, one of the
things I am wrestling with on the VMS version is the wide variety (i.e.,
endless number) of file and/or record types available under that operating
system, and how to deal with them, both when adding files *on* a VMS
system, and when extracting them, whether under VMS or on a foreign system.
If you have any comments/suggestions/whatever, let me know, either via a
posting here, or via email, if it's more convenient.
- Mike

conf: FILE COMPRESSION FORUM #1581 08-29-88 22:54 (Read 86 times)
from: GRANT ELLSWORTH
to: DEAN COOPER (Rcvd)
cc: PHIL KATZ
cc: PHILIP BURNS

Dean, are you all going to provide enough info in the public declaration
for a reasonably experienced pgmr to write code which will be able to
extract files and decompress them? I don't think that files can be
unCrunched, unPacked, unSqueezed by programmers without knowledge of the
ARC source code ... and unSquashing could probably be inferred from
PK's general doc on the Squash Info file.

And the public DOC on the .ARC files doesn't seem to say too much beyond
how to get to the header part for each file and ident its name/date

Re: file name sizes ... for PC DOS, UNIX, VMS and other relatives (like
OhS.../2), allow up to 63 for storing fully qualified path_names; for IBM
mainframes ... MVS tolerates 44 for full DSName + 10 for library/member
identifier and delimiters "()".

Re: non-ascii systems - e.g. IBM MAINFrames ... others ... similar or
analogous file structures you can create ... but you want to CODE stuff
to play games with reversing the order of bytes in continuous bit
streams????? I've done it for moving special bit-stream files from heavy
metal to little-iron ... it ain't fun ... but it can be done

Grant
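Grant's byte-order worry is tractable if the format simply fixes every multi-byte field to one byte order and makes each port convert explicitly (the ZIP format that eventually emerged from this effort defines all its header fields as little-endian). A minimal sketch in modern Python for illustration:

```python
import struct

def read_u32_le(buf: bytes, offset: int = 0) -> int:
    """Read a 32-bit unsigned little-endian field, regardless of host byte order."""
    return struct.unpack_from("<I", buf, offset)[0]

def write_u32_le(value: int) -> bytes:
    """Write a 32-bit unsigned field in little-endian order."""
    return struct.pack("<I", value)

# The same four bytes mean the same number whether the reader runs on
# little-iron (a PC) or on big-endian heavy metal - no stream reversal needed.
header_field = write_u32_le(18875)        # e.g. an uncompressed-size field
assert read_u32_le(header_field) == 18875
```

The design choice is that byte order is a property of the *format*, not of the machine that happens to write the file; each implementation does at most a cheap field-by-field swap on read and write.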

conf: FILE COMPRESSION FORUM #1582 08-29-88 23:16 (Read 88 times)
from: GRANT ELLSWORTH
to: MIKE SHAWALUK (Rcvd)

Mike, here is a suggestion for addressing the high variety of file types
on VMS (incl. cr/lf, lf ..... cr, var-lgth, fixed lgth, blocked, etc). I
used a similar technique for doing the almost as high a variety on heavy
metal ... get the file-type, block-size, record-length, etc.
characteristics out of the "FCB" (? - i've forgotten the VMS control block
name --- and I'd have to let my line go out to get it out of my manuals in
the boxes on the other side of the basement), and store that encoded info
in the compressed/library filename header exactly as VMS expects to find
it. Then, when you compress the file, compress EVERYTHING --- even the
line delimiters, the record length descriptor byte/word, etc., as if it
were a continuous stream of bytes ---- strip nothing! On decompress cycle,
build a large stream of output bytes in LARGE blocks.... and have the
decompressor driver call a different output writer for each general class
of file type (many of the specific file types can be grouped together in
one class --- e.g. var-lgth blocked and var length unblocked records ---
note --- do not try to compress by total var lgth blocks --- just use the
records ---- there is no need to carry the full block structure thru the
compressor/decompressor cycle --- the FCB characteristics should suffice)
Note: You can insert the file's characteristics (record length, record-
type, block-size, etc..) in the FCB before opening the file ... ditto
for the file's name (which heavy-metal will NOT let you easily do - but
VMS does!)

Hope this is helpful. Grant
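Grant's scheme (store the file's record attributes in the member header, compress the raw byte stream untouched, and on extraction dispatch to a different output writer per file-type class) can be sketched like this. This is a hypothetical illustration in modern Python, not the format the team actually shipped; the 2-byte record-length prefix stands in for VMS's record descriptor:

```python
# Per-class output writers: each knows how to lay the raw bytes back down.
def write_stream(raw: bytes) -> bytes:
    return raw                                # plain stream-of-bytes file

def write_var_records(raw: bytes) -> bytes:
    """Variable-length records: the raw stream carries a 2-byte little-endian
    length before each record; rebuild the records, joined here with newlines
    (a simplification for illustration)."""
    out, i = [], 0
    while i < len(raw):
        n = int.from_bytes(raw[i:i + 2], "little")
        out.append(raw[i + 2:i + 2 + n])
        i += 2 + n
    return b"\n".join(out)

# Many specific file types group into one general class, as Grant suggests.
WRITERS = {"STREAM": write_stream, "VAR": write_var_records}

def extract(member: dict) -> bytes:
    """Dispatch on the file-type class stored in the member's header."""
    return WRITERS[member["rectype"]](member["raw"])
```

The compressor itself never interprets the bytes ("strip nothing!"); only the final writer, chosen from the stored attributes, understands the record structure.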

conf: FILE COMPRESSION FORUM #1586 08-30-88 03:54 (Read 89 times)
from: PHILIP BURNS
to: MIKE SHAWALUK (Rcvd)

Thanks for the message.

On Vax file types: for the Vax, one can store the relevant
RMS specs with the file, and have user exits available to
recreate the file with the proper specs, as part of the
archiver. Or you might consider converting non-text files
to VMSHEX form, and then VMSDEHing them upon extraction.
Or perhaps that should just be left up to the user. However,
the main thing is to allow enough lines of commentary to
be associated with an entry so that someone could figure
out what to do.

Similar things can be done on other systems which have
the same variety (some even MORE variety) of file types
than VMS.

-- Pib

conf: FILE COMPRESSION FORUM #1597 08-30-88 21:50 (Read 91 times)
from: GRANT ELLSWORTH
to: DEAN COOPER (Rcvd)
subject: LZW COMPRESSION
cc: PHIL KATZ

Dean, and Phil, do you really think that you've developed another
compression which is faster and produces a higher compression than LZW? Is
LZW now as dated as SQz as a commonly used compression method? Grant
---------------

conf: FILE COMPRESSION FORUM #1604 08-31-88 00:13 (Read 92 times)
from: PHIL KATZ
to: GRANT ELLSWORTH (Rcvd)
subject: R: LZW COMPRESSION Reply to #1597
cc: DEAN COOPER

Grant,

Well, I'm not really at liberty to talk about this too much
right now. There are algorithms that can compress much
better than any LZW implementation that I have seen, but they
also are much slower too. The trick I guess is then to have
your cake and eat it too.

I will also go out on a limb here, and say that the "conventional" LZW
implementations that I have seen are quite inefficient in effectively
re-using or re-assigning codes when the table is full.

Anyway, I don't think that LZW is about to be overtaken like
SQueezing was. Also, I think that perhaps some of the better
refinements of SQueezing have been overlooked and that SQueezing
may not be entirely dead. In any event, I think that future
techniques might incorporate ideas from SQueezing, LZW, arithmetic
encoding, and others, combined synergistically.

>Phil>
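The "table full" inefficiency Phil criticizes is easy to see in a toy LZW coder. In this minimal sketch (modern Python for illustration only, not PKWARE's code), the string table simply freezes once it reaches its limit, so the coder can no longer adapt to data later in the file:

```python
def lzw_compress(data: bytes, max_code: int = 4096) -> list[int]:
    """Classic LZW: grow the string table until max_code, then freeze it -
    the conventional behavior Phil points at, with no code re-assignment."""
    table = {bytes([i]): i for i in range(256)}
    next_code, w, out = 256, b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            if next_code < max_code:          # table frozen once full
                table[wc] = next_code
                next_code += 1
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes: list[int], max_code: int = 4096) -> bytes:
    """Mirror of the compressor: rebuilds the same table from the codes."""
    table = {i: bytes([i]) for i in range(256)}
    next_code = 256
    it = iter(codes)
    prev = table[next(it)]
    out = [prev]
    for code in it:
        if code in table:
            entry = table[code]
        else:                                  # the KwKwK special case
            entry = prev + prev[:1]
        out.append(entry)
        if next_code < max_code:
            table[next_code] = prev + entry[:1]
            next_code += 1
        prev = entry
    return b"".join(out)
```

Smarter schemes reset or selectively re-assign table entries when compression degrades; in this conventional form, a full table just stops learning.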

conf: FILE COMPRESSION FORUM #1609 08-31-88 19:19 (Read 97 times)
from: THOMAS ZERUCHA
to: PHIL KATZ (Rcvd)
subject: NEW PKPAK/UNPAK

I for one would really appreciate it if you would write an early version of
*just the unpacker* in a very portable type of C so that the rest of us
could get it running very quickly for any arbitrary non-pc system.
Otherwise I hope you have plans to port it to *everything* in existence.
If there is one thing which the current ARC has over a potential new system,
it is PD source, so it can be made to work (with some effort) on any machine.
---------------

conf: FILE COMPRESSION FORUM #1611 08-31-88 19:59 (Read 103 times)
from: PHIL KATZ
to: THOMAS ZERUCHA (Rcvd)
subject: R: NEW PKPAK/UNPAK Reply to #1609

Thomas,

>>If there is one thing which the current ARC has over a potential new system
>>is PD source so it can be made to work (with some effort) on any machine.

I wouldn't exactly say that!! It is the contention of one New Jersey
company that anything that deals with an ARC file in any manner
requires a license from them, and any software that is even similar
to theirs when played backwards at 1/2 speed had pretty darn better
be licensed with them. They claim that the file format is proprietary,
and definitely not Public Domain.

On the other hand, the format for the new software that PKWARE is
developing will be entered into the public domain, with no restrictions
placed on other programs that read these new files.

>Phil>

conf: FILE COMPRESSION FORUM #1612 08-31-88 20:06 (Read 110 times)
from: PHIL KATZ
to: JIM DUNNIGAN (Rcvd)
subject: PLANS

Jim,

Well, I think it's a little too early for an official product
announcement or anything, but here's what is currently in
the works:

-------------------------------------------------------------
Caveat:
This is not an official press release. This is what
is currently planned for the new data compression
software forthcoming from PKWARE. The information
provided here is subject to change without notice.

o The file format for the new files will be made public.
Other software can read or write these files without
restriction. Additionally, PUBLIC DOMAIN source code
written in portable C language may be released to
demonstrate how an applications program can read the
information contained in a file created with the new
software.

o The software will be concurrently released for MS-DOS,
VAX/VMS, and Amiga. (This implies that long filenames
such as on an Amiga or Unix will be fully supported.)
OS/2, Unix, and mainframe development is currently being
investigated.

o The (MS-DOS) software will offer both a menu-driven full
screen interface and a command line interface.

o The software will provide significantly better compression
than the current software, and also offer vastly improved
reliability for recovering data from damaged and corrupted
files.

o The software will be able to process and traverse
subdirectories. The (MS-DOS) software will be able to span
multiple disks with compressed file collections.
---------------

