Understanding and Using NetBench(TM) 2.10

LICENSE AGREEMENT FOR ZIFF-DAVIS' NETBENCH(TM) 2.1

READ THIS AGREEMENT CAREFULLY BEFORE USING THE SOFTWARE EMBODIED
IN THE NETBENCH(TM) 2.1 DISKETTE (OR, IF DOWNLOADED, IN THE
DOWNLOADED FILE(S)). Embodied in the NetBench 2.1 diskette
("diskette") (or, if downloaded, in the downloaded file(s)) is
the NetBench 2.1 computer program and related documentation (the
"Software"). Ziff-Davis Publishing Company, L.P., having a place
of business at One Park Avenue, New York, New York 10016 ("Ziff")
is the licensor under this Agreement and you are the licensee.
By using the Software, in whole or in part, you agree to be bound
by the terms of this Agreement. If you do not agree to the terms
of this Agreement, promptly return the Software (or, if
downloaded, delete the Software) to the Ziff-Davis Benchmark
Operation at One Copley Parkway, Suite 510, Morrisville, North
Carolina 27560. Title to the Software and all copyrights, trade
secrets and other proprietary rights therein are owned by Ziff.
All rights therein, except those expressly granted to you in this
Agreement, are reserved by Ziff.

1. Limited License

This Agreement grants you only limited rights to use the
Software. Ziff grants you a non-exclusive, non-transferable
license to use the Software on a file server networked with
multiple PC computers for the sole purpose of conducting
benchmark tests to measure the performance of computer hardware
and operating system configurations. You have the right to make
a single copy of the Software for archival purposes and the right
to transfer a copy of the Software across a network only to the
PC computers attached to the network.

You may not publish or distribute benchmark test results obtained
by you from your use of the Software without prior written
permission from Ziff-Davis. For such permission, contact the
Ziff-Davis Benchmark Operation at the above address.

This Agreement and your rights hereunder shall automatically
terminate if you fail to comply with any provision of this
Agreement. Upon such termination, you agree to cease all use of
the Software, to delete the Software and to destroy all copies of
the diskette and other materials contained in this package in
your possession or under your control, or, if downloaded, to
destroy any and all copies of the Software in your possession or
under your control.

2. Additional Restrictions

A. You shall not (and shall not permit other persons or
entities to) rent, lease, sell, sublicense, assign, or otherwise
transfer the Software or this Agreement. Any attempt to do so
shall be void and of no effect.

B. You shall not (and shall not permit other persons or
entities to) reverse engineer, decompile, disassemble, merge,
modify, include in other software or translate the Software, or
use the Software for any commercial purposes, except for the
publication or distribution of test results with Ziff's prior
written permission, as provided above.

C. You shall not (and shall not permit other persons or
entities to) remove or obscure Ziff's copyright, trademark or
other proprietary notices or legends from any of the materials
contained in this package or downloaded.

3. Limited Warranty and Limited Liability

THE SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY
WARRANTY OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
THE ENTIRE RISK AS TO THE RESULTS AND PERFORMANCE OF THE SOFTWARE
IS ASSUMED BY YOU, AND ZIFF ASSUMES NO RESPONSIBILITY FOR THE
ACCURACY OR APPLICATION OF OR ERRORS OR OMISSIONS IN THE
SOFTWARE. IN NO EVENT SHALL ZIFF BE LIABLE FOR ANY DIRECT,
INDIRECT, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE SOFTWARE, EVEN IF ZIFF HAS
BEEN ADVISED OF THE LIKELIHOOD OF SUCH DAMAGES OCCURRING. ZIFF
SHALL NOT BE LIABLE FOR ANY LOSS, DAMAGES OR COSTS, ARISING OUT
OF, BUT NOT LIMITED TO, LOST PROFITS OR REVENUE, LOSS OF USE OF
THE SOFTWARE, LOSS OF DATA OR EQUIPMENT, THE COSTS OF RECOVERING
SOFTWARE, DATA OR EQUIPMENT, THE COST OF SUBSTITUTE SOFTWARE OR
DATA, CLAIMS BY THIRD PARTIES, OR OTHER SIMILAR COSTS.

THE ONLY WARRANTY MADE BY ZIFF IS THAT THE ORIGINAL PHYSICAL
MEDIA IN WHICH THE SOFTWARE IS EMBODIED AND WHICH IS DISTRIBUTED
BY ZIFF SHALL BE FREE OF DEFECTS IN MATERIALS AND WORKMANSHIP.
ZIFF'S ENTIRE LIABILITY AND THE USER'S EXCLUSIVE REMEDY SHALL BE
LIMITED TO THE REPLACEMENT OF THE ORIGINAL PHYSICAL MEDIA IF
DEFECTIVE. THE WARRANTIES AND REMEDIES SET FORTH HEREIN ARE
EXCLUSIVE AND IN LIEU OF ALL OTHERS, ORAL OR WRITTEN, EXPRESS OR
IMPLIED. NO ZIFF AGENT OR EMPLOYEE, OR THIRD PARTY, IS
AUTHORIZED TO MAKE ANY MODIFICATION OR ADDITION TO THIS WARRANTY.

SOME STATES DO NOT ALLOW EXCLUSION OR LIMITATION OF IMPLIED
WARRANTIES OR LIMITATION OF LIABILITY FOR INCIDENTAL OR
CONSEQUENTIAL DAMAGES, SO THE ABOVE LIMITATIONS OR EXCLUSIONS MAY
NOT APPLY TO YOU.

4. U.S. Government Restricted Rights.

The Software is licensed subject to RESTRICTED RIGHTS. Use,
duplication or disclosure by the Government or any person or
entity acting on its behalf is subject to restrictions as set
forth in subdivision (c)(1)(ii) of the Rights in Technical Data
and Computer Software Clause at DFARS (48 CFR 252.227-7013) for
DoD contracts, in paragraphs (c)(1) and (2) of the Commercial
Computer Software-Restricted Rights clause in the FAR (48 CFR
52.227-19) for civilian agencies, or in other comparable agency
clauses. The contractor/manufacturer is the Ziff-Davis Benchmark
Operation, One Copley Parkway, Suite 510, Morrisville, North
Carolina 27560.

5. General Provisions

Nothing in this Agreement constitutes a waiver of Ziff's rights
under U.S. copyright laws or any other Federal, state, local or
foreign law. You are responsible for installation, management,
and operation of the Software. This Agreement shall be
construed, interpreted and governed under New York law. If any
provision of this Agreement shall be held by a court of competent
jurisdiction to be illegal, invalid or unenforceable, the
remaining provisions shall remain in full force and effect.


Trademarks

NetBench(TM), WinBench(TM), WinMark(TM), Winstone(TM), PC Bench(TM),
DOSMark(TM), and MacBench(TM) are trademarks and WinBench(R) is a
registered trademark of Ziff-Davis Publishing Company, L.P.

Microsoft(R) and MS-DOS(R) are U.S. registered trademarks and Windows(TM)
is a trademark of Microsoft Corporation.

NetWare and Novell are registered trademarks of Novell, Inc.


Preface

This manual tells you how to install, set up, and run NetBench(TM).

NetBench is one of several benchmark programs developed by the Ziff-Davis
Benchmark Operation to help you test your PCs, servers, and Macintoshes. For
information on getting copies of other benchmarks or additional copies of
NetBench, see Appendix F.


Notes about NetBench

WARNING
The user is cautioned against running these tests on a "live" production
network. NetBench is designed to exercise the network in various ways.
Run NetBench in isolation, when there is no other user activity on the
network; the results are meaningless if the network is active with user
tasks.

Please note the following before you run NetBench:
1. You must read and agree to the license information in the front
of this manual before running NetBench. The same licensing
information appears on your PC's screen when you first run NetBench
and on-line in the README file. If you do not agree to the licensing
information, remove NetBench entirely from your system and return it
and all accompanying materials (including any documentation) to ZDBOp
at the following address:

The Ziff-Davis Benchmark Operation
One Copley Parkway, Suite 510
Morrisville, NC 27560

2. System Requirements
The minimum configuration required to run NetBench is one computer
which acts as a file server and another computer which runs the
NetBench test station software. This test station accesses the
file server through a logical drive.


NetBench Diskette Contents

ABOUT.TXT      Information about NetBench
BATCH.TXT      Sample batch command file
LICENSE.TXT    License text agreed to by the user (also located in the
               documentation)
LICFILE.TXT    Created by the License procedure
NETBENCH.EXE   Actual NetBench executable file
PARMS.CFG      Used in the Batch example
PARMS1.CFG     Used in the Batch example
README.TXT     Readme file referred to within NetBench
USERINFO.TXT   "Licensed to" text (user and organization); created after
               the first time the program is licensed to a user
ZD_NB210.PCX   Startup graphics screen
ZDLPARMS.CFG   Essential file required for NetBench to execute
NETBENCH.TXT   ASCII readable text version of the User Guide
NETBENCH.RTF   Rich Text Format version of the User Guide
NETBENCH.DOC   Word for Windows 2.0a version of the User Guide



Table of Contents

Chapter 1: NetBench Overview
   NetBench is a tool
   NetBench acts like an application
   Model of Network Application Layers
   The results reflect the system configuration
   Using NetBench

Chapter 2: Installing NetBench
   Installing on a PC
   Installing NetBench on a Network Server
   What's Next?
   Figure 2A (below)
   An example of the contents of AUTOEXEC.BAT
   An example of the contents of CONFIG.SYS
   An example of WS.ID contained in the root directory of C:\
   STARTUP.NCF
   AUTOEXEC.NCF

Chapter 3: Running NetBench
   NetBench Displays
   Ziff-Davis' NetBench Screensaver
   Waiting Display
   Activating the Test Station Coordinator
   Basic Navigation - Your choice: Mouse or Keypress
   Navigation Keys
   NetBench Menus and sub-menus
   File Menu
   Rules for Batch files
   Valid Syntax
   Valid Batch Examples
   Performance Menu
   Options Menu
   Set Name
   General
   NIC Options
   I/O

Chapter 4: Network Interface Card (NIC) Throughput Test
   Factors that influence the test
   Parameters
   Special Rules
   Guidelines
   Testing the server NIC
   Testing the test station NIC
   Setting up and Running the Test
   Evaluation Scenarios

Chapter 5: I/O Throughput Test
   Mechanics
   Objective
   Factors that influence the test
   Parameters
   Guideline
   Setting up and Running the Test
   Evaluation Scenarios

Appendix A: Data File and Statistics File Parameters

Appendix B: Temporary Files

Appendix C: ZDLPARMS.CFG

Appendix D: Data File Output Example

Appendix E: Statistics File Output Example

Appendix F: Technical Support
   Contacting ZDBOp

Benchmark Request Form

Glossary


Chapter 1
NetBench Overview

This guide describes what NetBench is, how it is installed, the basics
of setting up and running each of the NetBench tests, and some scenarios
of how each NetBench test has been used during pre-release testing to
measure network file server performance. It includes some cautions and
suggestions, but it cannot anticipate all possible uses (proper or improper)
for NetBench. You are responsible for understanding the objective of
your test and how you use NetBench.

NetBench is a tool

NetBench is a tool that helps determine relative file server performance. In
order to do this, two distinct tests were developed: Network Interface Card,
and I/O Throughput. The purpose of each test is to isolate, exercise, and
measure throughput over time at the respective system location: the network
interface card, or the file server disk I/O subsystem.

These tests are influenced by the operating system used by the network,
memory, memory cache, CPU types and speeds, server disk speeds, and
Network Interface Cards -- the same items that influence the performance of a
user application on a network.

Although the fundamental operation of a NetBench test does not change,
modifications to test parameters such as test duration and file size, as
well as modifications to the network system under examination produce results
which can be comparatively analyzed and evaluated. Proper interpretation
of the results produced allows the user to determine what type of network
configuration is most effective for a simulated network load.

NetBench acts like an application

NetBench was designed to "act like an application" (see Figure 1) and is not
reliant on any specific network operating system. In other words, "special
system calls" were avoided so that the specific network operating system used
would be transparent to the testing process. The goal of the software is
to test File Services (the capacity of the file server for file I/O activity),
and to test the capacity of the network through the network interface card.

The tests allow the user to specify one of two locations in the system to
generate a simulated "bottleneck" of activity. This helps the user determine
the network configuration best suited for a particular application environment.
The term "bottleneck" is deliberately chosen. It represents the maximum
number of times an "activity" is performed in a user-specified time frame.
The maximum number of "activities" performed depends on the specific hardware
chosen for the network and is also influenced by the number of test stations
configured for the test.

Finally, it is important to keep in mind that because each test station
performs the maximum number of test iterations possible during the test
period, an actual application load of the same network configuration would
likely perform better. NetBench tests worst-case scenarios. Typical
applications on a production network tend to be less resource demanding
than the "bottleneck" tests performed under NetBench.

Model of Network Application Layers

The following provides you with an example of the different layers that
are in a network application.

A. Application Layer running on a test station

NetBench viewpoint
--------------------------------------------------

B. Network Redirector (redirected drive)

C. Network Adapter Driver (protocol e.g. IPX, IP, DECNET)

D. Network Interface Card (Workstation adapter card)

E. Wire

F. Network Interface Card (Network server adapter card )

G. Server


Figure 1. Representation of a Simulated Network Application

In this example, everything that appears below the dashed line (i.e., B
through G) represents information hidden from NetBench.

The results reflect the system configuration

The results are representative of the specific configuration used (hardware,
software, and NetBench parameters). The results may be compared properly
when one variable at a time is modified in the configuration. Careful thought
must be given when choosing the variable to change. Some modifications may
have little or no effect on the network. Understanding which items make sense
to vary really depends on how well you (the NetBench user) understand the
network configuration that you are using.

Using NetBench

The following sequence is suggested to help you get started using NetBench:

1) Install NetBench.
In Chapter 2, the section, Installing NetBench on a PC, provides
instructions for installing NetBench on a single-user PC system should you
desire to become familiar with the tool before installing it on a network.
Network testing cannot be accomplished using the PC installation
procedure.

The section, Installing NetBench on a Network Server, provides
instructions for installing NetBench on a server drive for actual network
testing.

The Installing NetBench on a PC procedure should not harm a network as
long as NetBench is installed on the local C: drive. The network
installation procedure, however, will seriously impair a network (and
compromise its functioning) if the network is active with user tasks, and
the values returned by the test will be meaningless.

2) Decide on a test.
Determine which NetBench test to run: NIC Throughput or I/O Throughput.

3) Read about the test.
Read the corresponding section in this guide about the test.

4) Configure test network.
Make any test-specific adjustments to the test network.

5) Understand the NetBench menus.
Use the Running NetBench section to learn about using the menus to
configure NetBench parameters for the test.

6) Set any NetBench parameters.
Use the menus to set the NetBench parameters.

7) Run the test.


Chapter 2
Installing NetBench

This chapter describes how to install and start using NetBench. It
includes information on installing NetBench on a PC and on a network server.

Installing on a PC

This section provides instructions for installing NetBench on a PC in order
to become familiar with the tool. Network testing cannot be accomplished using
the PC installation procedure.

Load the NetBench files into an empty directory on the PC. To do this:

1. Create the directory on the PC. Any directory name will work; in this
example we will use FAKENET. The drive must be local with respect to the PC.
In this case we will assume the C: drive.

md fakenet

NOTE: This directory must be empty and should only contain files
which are pertinent to Ziff-Davis' NetBench. Files located in the
directory will be overwritten if they correspond to filenames
reserved by NetBench.

2. Insert Ziff-Davis' NetBench disk into the floppy drive and copy all
NetBench files to the newly created directory.
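
For example, assuming the diskette is in drive A: and the directory created
in step 1 is C:\FAKENET (adjust the drive letter and directory name to match
your system), the copy might be performed as follows:

C:
CD \FAKENET
COPY A:\*.*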

3. Run the NetBench License Utility.

NETBENCH /L

You are required to provide your Name and Organization under the terms
of the Licensing Agreement. Follow the directions on the display.

4. Create a file named WS.ID with any text editor and type a unique 3-digit
numeric identifier (example: 001) at the first position in the file
(no leading spaces or line feeds), then save the file as C:\WS.ID in the
root directory of the local drive (refer to the definitions of WS.ID and
Test Stations). The purpose of the file is to identify the test station
with a unique 3-digit numeric identifier. NetBench requires this file to
be located in the root directory of the C: drive of each test station.
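
As a convenience, the WS.ID file can also be created directly from the DOS
prompt instead of with a text editor (a sketch; 001 is only an example
identifier, and the identifier must still start at the first position in
the file):

COPY CON C:\WS.ID
001
^Z        (press Ctrl+Z, then Enter, to write the file)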

5. Once NetBench is installed it is best to:
A) Determine which NetBench test to run (NIC Throughput or I/O
Throughput)
B) Read the corresponding section in this guide about the test.

C) If you were actually using the network for testing, you would normally
make specific adjustments to the test network at this point in time.
Because you have installed NetBench on your local drive, NetBench
results are "simulated" and no network adjustments are required.
D) Go to the Running NetBench section to learn how to use the menus
and configure NetBench parameters for the test. It would be useful to
"play" with the test parameters in order to become familiar with them.
E) Configure the NetBench parameters
F) Run the test.

Installing NetBench on a Network Server

WARNING
The user is cautioned against running these tests on a "live" production
network. NetBench is designed to exercise the network in various ways.
Running NetBench should be done in isolation when other user activity on the
network is non-existent.

Load all the NetBench diskette files onto the server such that the program can
run off a common logical drive with the other test stations on the network.
To do this:

1. Log into the server from your workstation over your LAN connection.

2. Create a default directory on the server. This directory will be
accessed by all the test stations as a common workspace from which
they will read and write. Any directory name will work. In this
example we will use NETBENCH. Warning: This directory must be empty
and should only contain files which are pertinent to Ziff-Davis' NetBench.
Other files may be overwritten if they are contained in this directory.

J: {substitute your Network drive name for J in all cases}
MD J:\NETBENCH
CD J:\NETBENCH

3. Install the NetBench programs. Insert Ziff-Davis' NetBench disk into the
floppy drive and copy the floppy files to the server directory you have
designated.
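
For example, assuming the diskette is in drive A: and the server directory
created in step 2 is J:\NETBENCH (substitute your own drive letters), the
copy might be performed as follows:

COPY A:\*.* J:\NETBENCH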

4. Run the NetBench License Utility.

CD \NETBENCH
NETBENCH /L

You are required to provide your Name and Organization under the terms
of the Licensing Agreement. Follow the directions on the display.

5. At each test station: Create a file named WS.ID with any text editor and
type a unique 3-digit numeric identifier (example: 001) at the first
position in the file (no leading spaces or line feeds), then save the file
as C:\WS.ID in the root directory of the local drive (refer to the
definitions of WS.ID and Test Stations). The purpose of the file is to
identify each test station in the network with a unique 3-digit numeric
identifier. A sequential scheme is suggested for convenience (example:
001, 002, 003). NetBench requires this file to be located in the root
directory of the C: drive of each test station.

6. Set the login sequence on each test station to execute NetBench on
startup. Modify AUTOEXEC.BAT and CONFIG.SYS so that the test stations boot
and automatically start NetBench for the current test run. (See the Novell
examples in Figures 2A and 2B below.) Automatic boot up is not required, but
it is suggested for convenience. It also prevents other software from
interfering with the performance tests.

7. Prevent any non-test stations from interfering with the network. Warning:
Do not run these tests on the same network that other users are accessing,
since doing so will interfere with both the NetBench testing and the users.
Non-test stations may remain physically connected to the network but must
be logged off the network server.

8. Ensure that the file server and network LAN are properly configured for
the test. This includes:

* LAN server network interface card.
* All connectivity and cabling.
* Software drivers on LAN server and test stations.
* Operating system parameters which might affect the test
(Disk caching, memory, maximum packet size, minimum packet
size).

NetBench is now installed.


What's Next?

Once NetBench is installed it is best to:

* Determine which NetBench test you want to run: NIC Throughput or
I/O Throughput.
* Read the corresponding section in this guide about the test.
* Make any test specific adjustments to the test network.
* Go to the Running NetBench section to learn how to use the menus
and configure NetBench parameters for the test.
* Configure the NetBench parameters.
* Run the test.

Figure 2A (below).

Example of Test Station files which use a Novell operating system server
(Note: The following is only an example. The user must configure the test
stations to match the LAN server used.)

An example of the contents of AUTOEXEC.BAT:

@ECHO OFF
PROMPT $p$g
PATH C:\DOS;C:\;
set TEMP=C:\DOS
ipx                   Novell
net5                  Novell
e:                    logon sequence to file server
login supervisor      logon username to file server
g:                    common logical drive
cd \NetBench\         common logical directory
NetBench              start up NetBench

An example of the contents of CONFIG.SYS:

DEVICE=C:\DOS\SETVER.EXE
DEVICE=C:\DOS\HIMEM.SYS
DOS=HIGH
FILES=30
lastdrive=d

An example of WS.ID contained in the root directory of C:\

12345678901 column position in file
001 GROUPA1 file contents

NOTE: The 3-digit number in the WS.ID file must be unique for each test
station on the network.

Figure 2B (below). Example of matching Novell server configuration files

STARTUP.NCF
set minimum packet receive buffers=200
set maximum physical receive packet size=4096

AUTOEXEC.NCF
set enable disk read after write verify=OFF
set maximum packet receive buffers=2000
set maximum concurrent disk cache writes=100
set immediate purge of deleted files=ON
set new service process wait time=0.3
set maximum service processes=40


Chapter 3
Running NetBench

Now that NetBench is installed and the licensing procedure has been executed,
it is time to begin interacting with NetBench. To run NetBench, enter:

NETBENCH

If NetBench does not start up:

1. Check to see if the default directory for NetBench is present.

2. Verify that NetBench has been correctly installed. The ZDLPARMS.CFG
file is essential for NetBench to operate. See the ZDLPARMS.CFG
Appendix.

NetBench Displays

NetBench will always start up in the following sequence:
Display 1 ZIFF-DAVIS Logo
Pause

Display 2 Licensing display with Copyright and your ID
Pause

Display 3 "Waiting" display

The "Waiting" display is the default display which will appear on the test
station after the NetBench command is issued. All the test stations in the
waiting mode are waiting for the coordinator which will trigger testing within
its current directory path.

One test station must be designated as the Test Station Coordinator. The test
station coordinator becomes an interactive test station and no longer has the
message "Waiting." Any one test station can become the test coordinator
station.

The Test Station Coordinator also acts as a test station for the duration
of the test and reverts to the Test Station Coordinator again when the
test duration has elapsed.

Ziff-Davis' NetBench Screensaver

This is an optional screen saver "time out" feature which is activated from
the Options, General submenu. The Ziff-Davis logo "floats" around on the
display to prevent "burn-in" of the display phosphor. Pressing any key
restores the default NetBench display.

Waiting Display

ZD Labs NetBench(TM) .

File Performance Options Help


Waiting
Waiting for test identification from Coordinator station



[ OK ]



ID:ddd Group:ooooooo   Test Station     No Test Selected     Waiting...
                       ============                          ==========
                       Interactive                           Idle

(Words indicated below the === bar indicate status changes
which are reflected in the window immediately above.)


Activating the Test Station Coordinator

Activating the OK box in the waiting window causes the test station to become
"interactive" and available to become the Test Station Coordinator.

The OK box is selected by pressing the space bar or enter key, or by clicking
the OK button with the mouse. Because NetBench is performing activity on the
local PC in waiting mode, it is conceivable for the test station to "miss" a
mouse click. Repeating the mouse click or using the enter key or space bar
will usually assure success. Once the OK button has been activated the
"Waiting" window disappears. The word "Interactive" appears in place
of "Test Station".

It is now possible to select from a menu. To revert back to waiting mode,
select the Performance Menu and choose "Wait Mode". While two or more test
stations can be in interactive mode, only one test station acting as the Test
Station Coordinator may update items within any submenus, or run a test under
the Performance Menu. Because there is no practical reason to have multiple
test stations in "interactive" mode, Ziff-Davis strongly advises against
doing so.

Test stations in "interactive" mode cannot be selected as actual test stations
except the one actually running the test as the Test Station Coordinator. If a
second interactive test station attempts to activate another test
concurrently, an error will occur.

Basic Navigation - Your choice: Mouse or Keypress

Navigation is accomplished by using a mouse. If a mouse is not available or
not configured properly, alternative keys may be used. Use the ALT key
combined with the LETTER underlined within the menu or list.

If a menu is accidentally selected, press the ESC key to deselect the
menu. Items may be entered in dialog boxes by typing the requested
information. The TAB or SHIFT+TAB or ARROW keys are used to navigate
within dialog boxes. The ENTER key commits the action unless otherwise
specified.

Navigation Keys

Mouse - Point and click select a menu

Mouse - Point and click activate a choice

alt + letter select a menu or choice

Esc deselect a sub-menu

Tab forward to field

Shift+Tab backward to field.

Enter commit choice

Arrow keys Left, Right highlight a menu item

Arrow keys Up, Down highlight a submenu item


NetBench Menus and sub-menus

File               Performance       Options            Help
----               -----------       -------            ----
Data File...       NIC Test...       Set Name...        ReadMe
Batch File...      IO Test...        General Test...    About
Stats File...      Wait Mode         NIC Options...
Batch Run...       Stop Loads        IO Options...
Statistics
Quit


Status Message
Waiting for test identification from Coordinator station



                       Mode              Test name or        Error or
                                         operation           status

ID:ddd Group:ooooooo   Test Station      No test selected    Waiting....
                       ============      ================    ============
                       Interactive       NIC Throughput      Polling
                                         Test
                       Batch Cmd X       I/O Throughput      Running
                                         Test
                       Test Setup                            Making Test File
                                                             Done
                                                             Aborted
                                                             Idle

(Words indicated below the === bar indicate status changes
which are reflected in the window immediately above.)


File Menu

The file menu allows changes to file names referenced by NetBench. It is
also used to invoke the batch run file, and to compute and display
statistics captured in the statistics file. Finally, it allows a test
station in an interactive mode (test coordinator) to exit the NetBench
program back to the DOS operating system.

If Data file, Batch File, or Stats File is selected, the user is asked
to supply the filename(s) used while running NetBench. Each of the
filename parameters has a 12 character limit.

The Data file is the data collector file name referenced by NetBench. The
data file contains a detailed entry by station identification number (WS.ID)
for each test station successfully answering the poll at the conclusion of
the test. Each detail line represents the actual "raw" data collected from
the test station. It also retains a record of the parameters used during
the test. The Data file successively appends each test record to the
previous test record. Therefore, consider purging these files periodically.
At Ziff-Davis, file naming conventions are used to group tests together.

The Stats File (Statistics File) contains a summary of the test. It
contains a subset of the configuration data from the Data file. The Stats
file also contains a total throughput number that is the sum total of all
the test stations reporting at the end of a test run. Every time the
Statistics command is issued, the Stats file is regenerated from the
current Data file and the total is displayed.

The Batch file is used to automate a series of NetBench tests. The batch
file will generally take advantage of the group identifier code in the
WS.ID file. This section should be skipped until the user becomes familiar
with using NetBench.

Rules for Batch files

The Batch file consists of any number of test statements each terminated by
a semi-colon. The file is terminated by a statement consisting of a single
exclamation mark. Each statement should begin on a new line. A statement
consists of the following items which can be preceded and followed by one
or more spaces, tabs or new line characters:

Valid Syntax
1) 3 letter test code: NIC or IOT (these must be upper case).

2) Parameter file name (any valid DOS filename except ZDLPARMS.CFG,
case insensitive). This parameter file must follow the rules found in
Appendix: ZDLPARMS.CFG.

3) One or more group IDs, each of which is no more than 7 characters in
length. Group Identifiers correspond to those specified in each individual
test station WS.ID file. This field is converted into a DOS file name
and is therefore case insensitive.

Valid Batch Examples

Example: Single Batch Command

Command Activity
NIC parms.cfg group1; 1. Perform the NIC test, using
PARMS.CFG file, activated for those
workstations with common group
identification of GROUP1 in their
WS.ID file

! End of batch file indicator

Example: Multiple Batch Commands in a Single File

Command Activity
NIC parms.cfg group1; 1. Perform the NIC test
2. using PARMS.CFG file
3. activated for those workstations with
common group identification of GROUP1 in
their WS.ID file

IOT test1.prm sector5 sector6 1. Perform the IOT test
sector9 ; 2. using a different parameter file called
TEST1.PRM
3. using group id's sector 5, sector 6
and sector 9.

! End of batch file indicator


Example: Batch File with Legal Leading and Trailing Spaces

Command Activity
^NIC Perform the NIC test
^whatever.xxx using the file WHATEVER.XXX for
parameters
^123 Activated for group 123
^anygrp^grp2 group anygrp
^onemore group grp2, group onemore
^; end of NetBench statement
^! end of batch file indicator.

Note: ^ = one or more spaces
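
Putting these rules together, an actual batch file containing the two
statements from the multiple-command example above (using the example
parameter files PARMS.CFG and TEST1.PRM) would simply contain:

NIC parms.cfg group1;
IOT test1.prm sector5 sector6 sector9;
!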

The Statistics command must be executed in order to produce the statistics
file. Please refer to the comments above (Stats File).

The Quit command is executed in order to properly exit NetBench. This
command is generally used from the Test Coordinator Station. Stop Loads is a
related "exiting" command used from the Test Coordinator Station and is found
on the Performance menu. The Quit command exits the test station on which
the command is activated while the Stop Loads command exits all test stations
that are in the "Waiting" mode. For details please refer to the Stop Loads
command.


Performance Menu

The performance menu contains commands that begin NetBench tests and
commands that perform administrative functions.

* Selecting NIC Test causes the network interface card test to begin.
* Selecting I/O Test causes the I/O throughput test to begin.

The parameters associated with each test are supplied from the Options Menu.
All test stations perform the same test and each test station displays its
individual results on its own monitor. The Test Station Coordinator gathers
the data, and saves all results in the Data file.

Wait Mode changes a test station from interactive to "waiting". Waiting mode
signifies that the test station waits for commands directed to it from the Test
Station Coordinator. The command is useful when a test station has
accidentally been placed into "interactive" mode. There can only be one Test
Station Coordinator for any test on the same network.

Stop Loads forces all the test stations to exit the NetBench application and
return to the DOS prompt. This option is normally selected when all testing is
completed.

Use of this command will require the person performing the tests to invoke
NetBench on each test station again to resume testing.

Once the Stop Loads choice is selected there is no automatic way to resume
testing. To continue testing after performing this option requires either
rebooting (if the autoexec.bat automatically starts up NetBench), or manually
executing NetBench (which starts up in waiting mode).


Options Menu

The Options menu allows the user to set certain parameters, such as file size
and test duration, that influence the test. When a test is completed, these
variable parameters are written out to the Data file. The primary categories
for this menu are Set Name, General, NIC Options, and I/O.

Set Name

Set Name allows the user to enter a User Name and a LAN Name. These are
treated as comments in the Data file and Stat file (each has a 25 character
length limit).

General

General Test Options allow the user to set Test Length, Variance, Blanking
Interval, and Common Path.

Test Length (1 - 9999 minutes)

Test length, in units of minutes, is the actual duration of the test. This
number determines the duration of the test, after any preliminary
initialization sequence. Initialization time is not counted as part of this
number, nor is polling time counted as part of the test time.

Caution: Network tests should be run sufficiently long to
reach an ordered and productive behavior pattern
(particularly if a large number of work stations are present).
If the test is run for an insufficient time, the results may be
meaningless. Ziff-Davis usually sets this parameter at 10.

Variance (1 - 99 seconds)

Variance is a user-specified estimate of time (in seconds) between the first
and the last work station completing the test and reporting results. The more
test stations that are on the network performing the test, the greater the
variance time needed to ensure collection of results.

Ziff-Davis recommends that this number be set to a minimum of 20 seconds,
because most systems will have no more than a 20 second variance when
writing results files.

If the number of test stations reported in the Data file or Stat file is
less than expected, increase the variance number. It may also be necessary
to increase this number if the following message is received: "Warning:
Not all Load Stations statistics gathered".

Blanking Interval (1 - 99 minutes)

Blanking interval sets the duration, in minutes, before the screensaver will
activate. When the screensaver activates, the display will blank and the Ziff-
Davis logo will float around the display. Any key aborts the screensaver and
resumes the default NetBench display.

Common Path

The Common Path parameter specifies the directory that NetBench uses to
create temporary files and other temporary test resources. Ziff-Davis
recommends that this parameter always be set to ".".

NIC Options

NIC (Network Interface Card) Options relate specifically to the NIC throughput
test. The only available option for this test is the NIC block size. This
option sets the number of "usable" characters (bytes) that are sent through
the communication path in each transfer.

NIC Block Size (1 - 65,532 bytes)

This number is the size of the blocks in bytes that will be transferred during
the NIC Throughput test.

Because all test stations are reading from the same file (NETBENCH.EXE), the
entire test will run from the server cache subsystem. The test stations will
not access the disk and the bottleneck is moved to the NICs. Note: If a
network monitor were used to isolate the message sizes sent, the actual
packet size would be higher due to the overhead required for the packet
routing.

The tester needs to be aware of the physical minimum and maximum sizes of
usable bytes for the network protocol chosen.

Rules for selecting the proper number: For maximum line utilization, this
number should be set to the maximum packet-frame size (subtracting the
encapsulating header and trailer portion) which your LAN can handle.
(Example: 1460 to 1500 for Ethernet; the Ethernet maximum including overhead
is 1512.) For the maximum frames per second, set this number to 1.
Note: In order to run this test correctly the parameter entered here must
be less than the cache size of the server.

I/O

I/O Options relate to the Input/Output Throughput test. They exist to help
approximate network activity according to a desired model. Four buttons and
three parameters permit creation of the desired I/O operation and seek type
(random or sequential), I/O file size, I/O block size, and a ratio of reads to
writes.

Test type has two buttons: Random or Sequential

Read / Write has two buttons: Read only, Do Writes.

There are three numeric parameters: R/W Ratio, I/O File Size, and
I/O Block Size.

Valid combinations for these buttons and parameters are represented in the
following table:

Valid I/O Throughput     Read Only button          Do Writes button
test options

Sequential button        Sequential reads only     Sequential writes only
                         (R/W Ratio is not used)   (R/W Ratio is not used)

Random button            Random reads only         Random reads and writes
                         (R/W Ratio is not used)   according to the R/W Ratio

R/W Ratio parameter      Parameter not used        Parameter enterable and
                                                   active with Random and
                                                   Do Writes:
                                                   0  = write only
                                                   1  = 1 read for 1 write
                                                   10 = 10 reads to 1 write

I/O File Size parameter  Always required           Always required

I/O Block Size parameter Always required           Always required


R/W Ratio (0 - 99 valid range)

The Read/Write ratio number determines the number of reads for every write
performed during the I/O Throughput test. Setting this at 0 performs writes
only. Example: a ratio of 10 indicates that for each write performed on the
server there will be 10 reads. Correspondingly a ratio of 1 indicates that for
every write performed there will be 1 read. The parameter is only valid for
the button combination of Random, and Do Writes. It is ignored for the
other tests.

I/O File Size (Kb) (1 - 2,097,152 K bytes)

The I/O File Size field contains the size of a test file created by each test
station during the initialization sequence of the I/O Throughput test. This
file is not reused; it is recreated each time the test is run. A larger number
will increase the disk surface area on which the test operates, placing the
server disk subsystem under more stress. Allocating a large file size is
desirable as it reduces the impact of caching on test performance. If a large
number of test stations are present and disk space on the file server is low,
the file size may need to be reduced if the test file creation fails.

I/O Block Size (Bytes) (1 - 65532 bytes)

The I/O block size field contains the size (in bytes) of the block to be
read or written by the test station during the I/O Throughput test. This
block should be significantly smaller than the test file to allow for
random I/O; otherwise, due to caching, the results may be meaningless.
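
For example (hypothetical figures), a 20,480 Kbyte test file used with a
4,096-byte block size contains 5,120 distinct block positions, which spreads
random I/O widely across the file; a 64 Kbyte file with the same block size
contains only 16 positions and would be served almost entirely from cache,
making the results meaningless.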

Help

The HELP option permits access to a ReadMe ASCII file and an About
statement. The ReadMe file contains the Menu and Sub-menu portion of the
NetBench User Guide. The About statement contains the product name,
version, program developers, and the Ziff-Davis staff at the time of release.


Chapter 4
Network Interface Card (NIC) Throughput Test

This chapter describes how to use the Network Interface Card (NIC)
Throughput Test.

Mechanics

During this test, all NetBench test stations perform sequential reads from the
NETBENCH.EXE file of the size specified by NIC Throughput Block Size.
Because all stations are reading from the same file, the entire test will
run from the server cache subsystem. Because NETBENCH.EXE is in the server
cache, the test stations will not access the disk and the maximum data
throughput will be limited by the throughput capability of the network
interface cards. The test stations count the number of blocks read from the
server during the test and display the total throughput (number of
iterations times the block size in bytes) when the test is over.
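
As a hypothetical worked example: if a test station completes 100,000 read
iterations at a 4,096-byte block size during a 10-minute (600-second) test,
the total data transferred is 100,000 x 4,096 = 409,600,000 bytes, or
3,276,800,000 bits; averaged over the test length, that corresponds to
roughly 5.5 Mbits per second for that station.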

Objective

The objective of this test is to exercise network activity from the NIC test
station through the NIC on the file server.

This test result indicates the maximum amount of network traffic which can
occur within a specific amount of time using the test stations supplied at a
specific block size. (Refer to the Running NetBench section, Parameters,
Testing the server NIC and Testing the test station NIC).

The NIC Throughput test results are in bits per second.

Factors that influence the test
* NIC card contained in the server and associated driver software.
* NIC card contained in the test station and associated driver software.
* Cache on NIC card.
* Theoretical Maximum network speed (Ethernet 10 Mbits, Token Ring 4
Mbits, 16 Mbits, FDDI, etc.).
* Cable type - bandwidth.
* CPU in the server.
* CPU in the test station.
* Memory in server.
* Memory in test station.

Parameters

NIC Throughput Block Size - The user should vary the block size to simulate
the network application for small and large block sizes across successive
test runs. Workstation drivers usually have a minimum block size. The
parameter selected for NetBench should never drop below this minimum
because network activity must be forced to occur on the cable used.
Ridiculous results may be an indicator of "out of bounds" conditions,
for example: 15 Mbits/sec when the maximum for Ethernet is 10 Mbits/sec.

Special Rules

1. This test is network operating system (NOS) dependent: one NOS cannot
properly be compared with another NOS while also varying the NIC at the
same time.

2. Do not turn cache memory off on the server.

3. Network configuration files in the test station must remain constant for a
proper comparison of NIC cards.

4. Different software driver versions from the same manufacturer may affect
the results. Be certain the item being measured is the correct one. Vendor
cards may also have version differences.

Guidelines

Determine which type of card to test. There are often two kinds of NIC
adapters: server cards and workstation cards. The NIC installed in the server
must have a performance capability as high as or higher than that of the test
station NIC for the test to be informative.

Testing the server NIC

If the test is designed to test the server network interface card throughput,
replace only that card in the server and rerun the test with each new
server card to be tested, not varying the NIC adapters in the test stations.
This type of test would generally require a server and multiple test
stations.

Example: For testing an Ethernet Novell network, 1 server and 4 test
stations are probably required. If each test station had the capacity to
generate 3 Mbits of information the network would be saturated, because 4
stations times 3 Mbits = 12 Mbits and the capacity of the Ethernet is only
10 Mbits.

Testing the test station NIC

If your test is designed to evaluate the test station NIC, only one server
and one test station are required. The highest performance NIC card would
still be placed in the server for the test. However the various cards
that were being evaluated would be placed in the test station, varying one
at a time for each test with all other factors remaining the same.

Setting up and Running the Test

1. Configure the physical network, cable connectivity and server connectivity.
Ensure that the network is in operation, proper software network
parameters have been established, NetBench has been installed on the
server common drive from a workstation and licensed, and that all test
stations have been logged on the server and have changed their current
directory to the common drive on the server. Each test station should
start up NetBench using this common drive and directory. The test stations
must not reference NetBench from any other location (such as a copy on a
local disk).

2. Select one test station to act as the Test Station Coordinator.

3. Set the File Menu parameters (File Menu - Data File, Stats File).

4. Set the Options (Options - Set Name, General Test, NIC options)

5. Run the test (Performance, NIC test)

6. The display on each test station should show its individual results. This
will also be true of the Test Station Coordinator.

7. Collect the results (File menu - Statistics). A summary will display on the
Test Station Coordinator. The total number of test stations responding to
the test will be shown in the Final Statistics window.

8. Exit NetBench (File - Quit). (Or run the next test.)

9. Examine the detail results in the Data File using a text editor or from a
printout. There should be an entry in this file for each test station
which reported back results to the Test Station Coordinator during the
polling sequence. The entry is listed by the station WS.ID number.

If an entry is missing, there are three possible causes.

Cause A The missing test station was not set up correctly to point to the
common directory on the file server.

Cause B The WS.ID file is incorrect or has the same number on two or
more of the test stations. These must be unique numbers for each test
station on the network and must be on the individual C: root directory for
each test station.

Cause C The Variance setting under Options Menu, General Test was too
short for all the test stations to report back in sufficient time to the
Test Station Coordinator. Increase this value. However, if there are
fewer than 10 test stations on the network and the variance setting is
greater than 30 seconds, re-examine Cause A or B, and the network
connectivity. Thirty seconds should be plenty of time for 10 stations
to respond unless something is wrong with the network itself.

10. Examine the totals for all test stations in the Statistics File.


Evaluation Scenarios

Example 1
The test bed consisted of one test station connected to a Novell NetWare
server. A Vendor XYZ 486/33M system with 12MB of RAM was used for the server
and a Vendor XYZ 386/25 with 8MB of RAM was used for the test station. The
server was loaded with Novell's NetWare 3.11 network operating system. The
test station was a DOS client loaded with ABC drivers.

A QRX card was installed in the server and remained in place throughout this
test. Each card tested was placed in the workstation in turn. Using
Ziff-Davis' NetBench NIC test, the performance of the card was measured at
different block sizes (512, 1K, 2K bytes). The test was performed for
the VendorCardA, VendorCardB, VendorCardC, and VendorCardD cards. The
duration of each test was 10 minutes.

Test 1 (Netware Server)
Vendor Name 512B 1KB 2KB
VendorCard A 2331989 3402410 3899939
VendorCard B 1889621 2747050 3064162
VendorCard C 2009770 2861738 3200341
VendorCard D 2012501 2824263 3145728


Example 2
The test bed consisted of 3 test stations and one LANManager server.
Ziff-Davis used a Computer 486/33M system with 12MB of RAM for the server
and 486/33M systems with 8MB of RAM for the workstations. The server was loaded
with Microsoft's LANManager 2.1 with OS/2 1.3 as a primary operating
system. The test stations used DOS.

A vendor ACF card was installed in each of the three client test stations
and in the server. Using Ziff-Davis' NetBench NIC test, the total
throughput was measured at a 4Kbyte block size.

The same procedure was repeated by replacing the ACF card in the server with
the FGH card. The duration of each test was 10 minutes.

VendorName Throughput(bits per sec)
FGH 9209874
ACF (server) 8312173
ACF (client) 8194716


Chapter 5
I/O Throughput Test

This chapter explains how to use the I/O Throughput Test.

Mechanics

When this test is selected, each test station writes a test file to the common
server drive. Once the Test Station Coordinator determines that all test
stations have completed this initialization operation, I/O throughput testing
begins. During the test, each test station performs the file reads and/or
writes (at the block size, for the length of time specified, according to
the parameters set under the I/O Options menu). After the test duration
has elapsed, each test station reports its results.

Objective

The objective of this test is to determine the performance of a file server
relative to a baseline system. Performance in this case is defined in terms
of two concepts - peak I/O throughput, and scalability. Peak I/O throughput
is the maximum amount of user data (block size and file size) which can be
read or written to the server during the test. Scalability is the ability of
a server to sustain peak I/O throughput (disk and disk cache) as the file
service load on the server is increased during the test.

File server performance should be measured across the path from the test
station to the server disk. To use this test effectively, ensure that the
system is configured so that the desired components are stressed and that
extraneous components do not influence or hamper performance. Components
such as the number of clients and the network wire must be set up so that
they do not cause bottlenecks.

For example, to prevent the NIC from becoming the bottleneck, multiple NICs
would be placed in the server, each corresponding to a separate LAN segment
(usually four NICs are sufficient). Multiple test stations and a server NIC
would be attached to each segment so that the server becomes a common link to
all LAN segments. Additional test stations could be added to the segments and
the test re-run to identify the point at which server I/O Throughput begins to
be maximized. Also, by using higher bandwidth network services, such as 16Mbit
Token Ring, the network bottleneck is reduced and more emphasis is placed on
file service performance. This will cause the disk subsystem to be stressed
rather than the LAN NICs.

To use NetBench properly, a throughput curve should be plotted for comparing
performance measurements. A throughput curve is formed by combining the
results (datapoints) from multiple NetBench iterations into a single dataset.

The dataset is formed by configuring a baseline network, setting the I/O
Options parameters, running the I/O throughput test, varying one load factor
(for example, increasing test stations) and running the test again, varying the
same load factor and running the test again, etc.

A throughput curve, formed by charting the collection of data points, not
only displays the peak throughput of a server, but also demonstrates the
server's scalability and how the server will perform under light to harsh
conditions.

By comparing one single data point to multiple points (the throughput curve),
it becomes clear that a single point measurement can be very misleading. For
example, experiments with this test have shown that throughput curves of
servers being compared often cross. A server which displays superior
performance at low test station loads may display inferior performance as the
load is increased. This information is very important to a complete analysis
of performance.

Throughput curves for file servers generally fall into three categories. The
most common is the Poor Scalability category, typical of a desktop machine
turned into a file server. This curve may display an acceptable throughput
at a light test station load, but the performance quickly drops off as the
load is increased. The next category is that of Median Scalability. This
curve has a very high peak throughput but as soon as the client load rises
above the ability of the cache, the performance drops rapidly and bottoms
out at a low level. The last is the Superior Scalability category. This type
of curve has a very high peak throughput that is sustained far beyond the
other two categories.

I/O Throughput test results are in bits per second.

Factors that influence the test
* Cache in the server.
* Cache in the test station.
* Number of test stations.
* All NetBench Option parameters for I/O Options.

Parameters

I/O File Size (in Kbytes)
One disk file is created for each test station prior to the commencement of the
test (of the size specified in I/O Throughput File Size). This file is
created on the server in the current working directory. Increasing this
number creates a greater amount of relative disk head movement from the
beginning of the file to the end of the file.

I/O Block Size (in Bytes)
Each test station testfile is read or written a block at a time according to
the I/O Throughput Block Size parameter. Increasing this number decreases
the amount of disk activity for the given test (if other factors are held
constant).

Read or Write and Random vs Sequential are specified according to radio
buttons selected in the Options, I/O Options submenu.

There are five read/write options for this test. These are Sequential Read
only, Sequential Write only, Random Read only, Random Write only*, and Random
Read/Write*.

If Sequential is selected, the only testing performed is Sequential Read or
Sequential Write according to the respective button selected: Read Only or Do
Writes. Sequential Read and Write is NOT performed.

If Random is selected with the Read Only button the only testing performed are
Random Reads.

If Random is selected with the Do Writes button, then Random Reads and
Random Writes are permitted according to the number specified in R/W Ratio
field.

I/O Options Definitions

Sequential means that each test file created on the server by the test
station is read or written one block at a time according to the Read or
Write operation selected. When the end of the file is reached, the test
sequence starts over and continues until the test duration limit has elapsed.

Random means that each test file created on the server by each test station is
read or written one block at a time according to the Read, Write, Read/Write
operation selected. The relative "start point" of the read or write is
randomly determined by each test station. This process repeats until the
test duration has elapsed. An attempt to read or write beyond the end of the
file size allocated is not permitted.

* When both the Random button and Do Writes button are selected, the mix of
read/write operations are determined by the read/write ratio field. The
read/write ratio field represents the number of read operations for every write
operation. Example: If the ratio is 3, there are 3 reads for every 1 write.
If the ratio is 0 there is only random writing. If the ratio is 10, there
are 10 reads for 1 write.

Guideline

Generally use multiple NICs in the server with multiple LAN segments in order
to cause a bottleneck at the disk. The number of LAN segments may also be
varied.

Setting up and Running the Test

1. Configure the physical network, cable connectivity, and server connectivity.
Ensure that the network is in operation, that the proper network software
parameters have been established, that NetBench has been installed on the
server common drive from a workstation and licensed, and that all test
stations have logged on to the server and changed their current directory
to the common drive on the server. Each test station should start NetBench
from this common drive and directory. The test stations must not run
NetBench from any other location (such as a copy on a local disk).

2. Select one test station to act as the Test Station Coordinator.

3. Set the File Menu parameters (File Menu - Data File, Stats File).

4. Set the Options (Options - Set Name, General Test, IO options).

5. Run the test (Performance, IO test).

6. The display on each test station should show its individual results. This
will also be true of the Test Station Coordinator.

7. Collect the results (File menu - Statistics). A summary will display on the
Test Station Coordinator. The total number of test stations responding to
the test will be shown in the Final Statistics window.

8. Exit NetBench (File - Quit), or run the next test.

9. Examine the detailed results in the Data File using a text editor or from a
printout. There should be an entry in this file for each test station that
reported results back to the Test Station Coordinator during the polling
sequence. Entries are listed by station WS.ID number.

If an entry is missing, there are three possible causes.

Cause A: The missing test station was not set up correctly to point to the
common directory on the file server.

Cause B: The WS.ID file is incorrect, or two or more test stations have the
same number. Each test station on the network must have a unique number, and
the WS.ID file must be in the root directory of each test station's local
C: drive.

Cause C: The Variance setting under the Options menu, General Test, was too
short for all the test stations to report back to the Test Station
Coordinator in time. Increase this value. However, if there are fewer than
10 test stations on the network and the Variance setting is greater than 30
seconds, re-examine Causes A and B and the network connectivity; thirty
seconds should be ample time for 10 stations to respond unless something is
wrong with the network itself.

10. Examine the totals for all test stations in the Statistics File.

Evaluation Scenarios

Example: evaluating servers on the I/O Throughput test with a 1024-Kbyte test
file, with test stations evenly distributed across 4 NICs. Two NICs are Token
Ring; two are Ethernet. (All throughput values are in bits per second.)

# of Test Stations   Server Vendor C   Server Vendor D   Server Vendor E
 8                   17317884          12937894          14231139
16                   23567830          18288905          14905063
24                   24918961          20149030          15439179
48                   23426979          19689402          12576168
72                   21369669          16986887          4200039
96                   18114029          13708320          3880144


Example: evaluating servers on the I/O Throughput test with 96 test stations
evenly distributed across 4 NICs. Two NICs are 16-Mbit Token Ring; two are
Ethernet. (All throughput values are in bits per second.)

File size (Kbytes)   Server Vendor C   Server Vendor D   Server Vendor E
 512                 23464845          19857675          6469524
1024                 18114029          13708320          3880144
2048                 14063785          10256700          2980629
4096                 12039712          9155468           2635356
8192                 10873353          7716024           2319071

These statistics were gathered under Novell NetWare 3.11 with the same amount
of server RAM; the servers differed in CPU and disks. Test Length was 10
minutes.


Appendix A
Data File and Statistics File Parameters

Parameters Common to Both Files
* @@@ (repeating-data delimiter; see below).
* Date / Time Stamp.
* User name comment line.
* LAN Tested comment line.

Parameter    Set In                                  Test (Units / Values)
Test File    Options, I/O Options, I/O File Size     (I/O throughput test) kbytes
IO Blocks    Options, I/O Options, I/O Blocks        (I/O throughput test) bytes
Do Writes    Options, I/O Options                    (I/O throughput test) Y=Do Writes, N=Read Only
Sequential   Options, I/O Options                    (I/O throughput test) Y=Sequential, N=Random
R/W Ratio    Options, I/O Options                    (I/O throughput test)
NT Blocks    Options, NIC Options, NIC Block Size    (NIC Throughput test) bytes
Test Length  Options, General Options
Variance     Options, General Options
Blank Int.   Options, General Options                Blanking Interval
Common Path  Options, General Options
Data File    File, Data
Batch File   File, Batch
Comment      Performance, (any test selected)
Test Type    Performance, (any test selected)


NIC Throughput
...
{ Common parameters }...
@@@
012 3283690 0 123200 600
022 3283690 0 123200 600
132 3283690 0 123200 600
@@@

@@@ Indicates start or end of repeating data.

Each line represents the results from one test station. The fields are
described from left to right.

* Workstation ID
* Test Station Throughput (bits/sec)
* Total Writes During the Test (Always 0)
* Total Reads During the Test
* Duration of the Test Station run (sec)
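
As a rough cross-check of these fields, the reported throughput can be
reproduced from the counts as reads x block size x 8 bits / duration. For
the first sample line above, assuming the 1999-byte NIC Block Size shown in
the Appendix C parameter listing (the block size for this run is not shown
in the abbreviated sample), 123200 reads x 1999 bytes x 8 / 600 seconds is
approximately 3,283,690 bits/second, which matches the reported value.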

I/O Throughput
...
{ Common parameters } ...
@@@
010 143360 0 1050 60 0.00000 0.05714
011 143360 0 1050 60 0.00000 0.05714
012 143360 0 1050 60 0.00000 0.05714
@@@

@@@ Indicates start or end of repeating data.

Each line represents the results from one test station. The fields are
described from left to right.

* Workstation ID
* Test Station I/O Throughput (bits/sec)
* Total Writes During the Test
* Total Reads During the Test
* Duration of the Test Station run (sec)
* Average Time Per Write Cycle(sec)*
* Average Time Per Read Cycle(sec)*

*Note that the value for these two items will always be 0.00000 when the
Random and Do Writes I/O Throughput test options are selected and the R/W
Ratio is greater than 0.
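
Because the record lines are plain whitespace-separated fields, they are
easy to post-process outside of NetBench. Below is a minimal Python sketch
(a helper, not part of NetBench) that pulls the per-station records out of
a Data File. It assumes, per the layouts above, that NIC Throughput records
have five all-numeric fields and I/O Throughput records have seven; the
dictionary keys are illustrative names only.

def parse_data_file(path):
    # Collect per-station result records from a NetBench Data File.
    # Header lines, @@@ delimiters and blank lines are skipped; a record
    # line is all-numeric with 5 fields (NIC Throughput) or 7 fields
    # (I/O Throughput).  CPU Bandwidth records use a different layout
    # and are ignored here.
    records = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) in (5, 7) and all(
                    field.replace(".", "", 1).isdigit() for field in fields):
                record = {
                    "ws_id": fields[0],
                    "throughput_bps": int(fields[1]),
                    "writes": int(fields[2]),
                    "reads": int(fields[3]),
                    "duration_s": int(fields[4]),
                }
                if len(fields) == 7:
                    record["avg_write_s"] = float(fields[5])
                    record["avg_read_s"] = float(fields[6])
                records.append(record)
    return records

# Example: sum the per-station throughput over every record in the file.
# records = parse_data_file("nb1912.dat")
# print(sum(r["throughput_bps"] for r in records))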

Stats File

This file contains the configuration information for each run, followed by
lines giving the total system throughput and the number of workstations
that reported for that test iteration (see Appendix E for an example).
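
For example, in the sample output of Appendices D and E, the ten per-station
I/O Throughput values for the run time-stamped 16:57:21 sum to 8,293,578
bits per second, which is exactly the Total Throughput recorded for that run
in the Statistics File.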


Appendix B
Temporary Files

NetBench creates a number of temporary files for coordination between test
stations. The names (or "wild card" patterns) are listed below. This is not
a comprehensive list, so avoid placing any file that you need to retain in
this directory.

IOTEST .SEM
NTTEST .SEM
CBTEST .SEM
NOTEST .SEM
??TEST .SEM
EXTEST .SEM
IOT_YES .SEM
IOT_NO .SEM
IOT_* .SEM
FILEOPT .SEM
EVALOPT .SEM
STATOPT .SEM
STS* .SEM
STF* .SEM
STB* .SEM
ST* .SEM
B* .SEM
YXZVQ???.X$X
IOT* .X$X
* .X$X


Appendix C
ZDLPARMS.CFG

Listing of ZDLPARMS.CFG (same format for any Parameter file)

Mike Bicuspid User Name
Lan A LAN Name
1024 I/O File Size Kbytes
512 I/O Block Size Bytes
1 1=Do Writes; 0=Read only
0 1=Sequential; 0=Random
3 R/W Ratio
1999 NIC Block size bytes
10 Test Length minutes
20 Variance in seconds
5 Blanking Interval minutes
. Current Directory (period)
data.dat Data File
batch.txt Batch File
stat.dat Stats File
NetBench(tm) Version 2.10


The ZDLPARMS.CFG file is essential for NetBench to operate; if it is
accidentally deleted, NetBench cannot run. If the ZDLPARMS.CFG file is
missing, obtain it from the backup disk, or look for the alternative file
PARMS.CFG and copy that ".cfg" file to ZDLPARMS.CFG. NetBench will then
start up properly.

If this file cannot be located, it can be carefully re-created with a text
editor. A newline character terminates each line, and it is essential that
the newline appear within the line-length limit set forth in the
documentation. This file is in the same format required for the batch
parameter files referenced by the batch file syntax.

For batch operations, it is quickest to copy the ZDLPARMS.CFG file to a new
file name and then carefully edit the parameters you wish to change.
Alternatively, set up your parameters within NetBench, then quit and copy
ZDLPARMS.CFG to a new file name. Repeat this procedure for each new
parameter file.
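
If many parameter files are needed, the copy-and-edit step can also be
scripted. The Python sketch below is an external helper, not part of
NetBench; it assumes the line order shown in the Appendix C listing, the
output file names are invented for the example, and the line-length limit
noted above still applies to whatever it writes.

def make_parameter_file(template_path, out_path, changes):
    # changes maps a 1-based line number in the template to a new leading
    # value.  Only the value at the start of that line is replaced; the
    # descriptive text that follows it in the listing is preserved.
    with open(template_path) as f:
        lines = f.read().splitlines()
    for lineno, value in changes.items():
        parts = lines[lineno - 1].split(None, 1)
        rest = ("  " + parts[1]) if len(parts) > 1 else ""
        lines[lineno - 1] = str(value) + rest
    with open(out_path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Example: vary the I/O File Size (the third line of the Appendix C
# listing) to produce one parameter file per batch step.
for size_kb in (512, 1024, 2048):
    make_parameter_file("ZDLPARMS.CFG", "IO%d.CFG" % size_kb, {3: size_kb})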


Appendix D
Data File Output Example

@@@
Sat Sep 26 16:52:14 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
@@@
026 1611244 0 35600 181 0.00000 0.00000
027 380181 0 8400 181 0.00000 0.00000
028 537180 0 12000 183 0.00000 0.00000
029 504123 0 11200 182 0.00000 0.00000
030 452596 0 10000 181 0.00000 0.00000
036 1529173 0 33600 180 0.00000 0.00000
037 133565 0 6000 184 0.00000 0.00000
038 543116 0 12000 181 0.00000 0.00000
039 468114 0 10400 182 0.00000 0.00000
040 432105 0 9600 182 0.00000 0.00000
@@@
Sat Sep 26 16:57:21 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
@@@
026 1194211 820 2460 180 0.2195 0.00000
027 728177 500 1500 180 0.3600 0.00000
028 833035 572 1716 180 0.3147 0.00000
029 766043 526 1578 180 0.3422 0.00000
030 798082 548 1644 180 0.3285 0.00000
036 1188386 816 2448 180 0.2206 0.00000
037 570633 394 1182 181 0.4594 0.00000
038 734003 504 1512 180 0.3571 0.00000
039 805260 556 1668 181 0.3255 0.00000
040 675748 464 1392 180 0.3879 0.00000
@@@
Sat Sep 26 17:01:41 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : CPU Bandwidth
@@@
026 171 30901 180 0.00000 0.00000
027 80 14401 180 0.00000 0.00000
028 73 13201 180 0.00000 0.00000
029 81 14601 180 0.00000 0.00000
030 84 15201 180 0.00000 0.00000
036 136 24701 181 0.00000 0.00000
037 80 14401 180 0.00000 0.00000
038 96 17401 180 0.00000 0.00000
039 90 16501 182 0.00000 0.00000
040 87 15801 181 0.00000 0.00000
@@@
Sat Sep 26 17:06:06 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
@@@
026 1419946 0 31200 180 0.00000 0.00000
027 414101 0 9200 182 0.00000 0.00000
028 525012 0 11600 181 0.00000 0.00000
029 519274 0 11600 183 0.00000 0.00000
030 418702 0 9200 180 0.00000 0.00000
036 1456355 0 32000 180 0.00000 0.00000
037 425098 0 9600 185 0.00000 0.00000
038 582542 0 12800 180 0.00000 0.00000
039 468114 0 10400 182 0.00000 0.00000
040 455111 0 10000 180 0.00000 0.00000
@@@
Sat Sep 26 17:11:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
@@@
026 1193406 824 2472 181 0.2197 0.00000
027 728177 500 1500 180 0.3600 0.00000
028 780606 536 1608 180 0.3358 0.00000
029 798082 548 1644 180 0.3285 0.00000
030 675748 464 1392 180 0.3879 0.00000
036 1127219 774 2322 180 0.2326 0.00000
037 710701 488 1464 180 0.3689 0.00000
038 798082 548 1644 180 0.3285 0.00000
039 835948 574 1722 180 0.3136 0.00000
040 771868 530 1590 180 0.3396 0.00000
@@@
Sat Sep 26 17:15:31 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
@@@
026 1601991 0 35200 180 0.00000 0.00000
028 540131 0 12000 182 0.00000 0.00000
029 561219 0 12400 181 0.00000 0.00000
030 527928 0 11600 180 0.00000 0.00000
036 1683659 0 37200 181 0.00000 0.00000
038 582542 0 12800 180 0.00000 0.00000
039 543116 0 12000 181 0.00000 0.00000
040 546133 0 12000 180 0.00000 0.00000
@@@
Sat Sep 26 17:19:51 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
@@@
026 1818443 0 40400 182 0.00000 0.00000
029 720175 0 16000 182 0.00000 0.00000
030 669843 0 14800 181 0.00000 0.00000
036 1792282 0 39600 181 0.00000 0.00000
039 819200 0 18000 180 0.00000 0.00000
040 651739 0 14400 181 0.00000 0.00000
@@@
Sat Sep 26 17:24:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
@@@
026 2038897 0 44800 180 0.00000 0.00000
030 1037653 0 22800 180 0.00000 0.00000
036 2166328 0 47600 180 0.00000 0.00000
040 1134276 0 25200 182 0.00000 0.00000
@@@
Sat Sep 26 17:29:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
@@@
026 1808793 1242 3726 180 0.1449 0.00000
029 1179648 810 2430 180 0.2222 0.00000
030 1034012 710 2130 180 0.2535 0.00000
036 1721412 1182 3546 180 0.1523 0.00000
039 1034012 710 2130 180 0.2535 0.00000
040 1112655 764 2292 180 0.2356 0.00000
@@@
Sat Sep 26 17:34:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
@@@
026 2236962 1536 4608 180 0.1172 0.00000
030 1430141 982 2946 180 0.1833 0.00000
036 2271914 1560 4680 180 0.1154 0.00000
040 1517522 1042 3126 180 0.1727 0.00000


Appendix E
Statistics File Output

Example

Sat Sep 26 16:52:14 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
Total Throughput = 6591397
Number of Work Stations = 10
Sat Sep 26 16:57:21 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
Total Throughput = 8293578
Number of Work Stations = 10
Sat Sep 26 17:01:41 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : CPU Bandwidth
Total Throughput = 978
Number of Work Stations = 10
Sat Sep 26 17:06:06 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
Total Throughput = 6684255
Number of Work Stations = 10
Sat Sep 26 17:11:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
Total Throughput = 8419837
Number of Work Stations = 10
Sat Sep 26 17:15:31 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
Total Throughput = 6586719
Number of Work Stations = 8
Sat Sep 26 17:19:51 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
Total Throughput = 6471682
Number of Work Stations = 6
Sat Sep 26 17:24:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 1024 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 512 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : NIC Throughput
Total Throughput = 6377154
Number of Work Stations = 4
Sat Sep 26 17:29:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
Total Throughput = 7890532
Number of Work Stations = 6
Sat Sep 26 17:34:11 1992
User Name : Mike Bicuspid
LAN Tested : Token Ring
Test File : 512 kb
IO Blocks : 8192 b
Do Writes : Y
Sequential : N
R/W Ratio : 3
NT Blocks : 1024 b
Test Length: 10 minutes
Variance : 20 seconds
Blank Int. : 15 minutes
Common Path: .
Data File : nb1912.dat
Batch File : batch.txt
Stats File : nb1912s.dat
Comment : NetBench test
Test Type : I/O Throughput
Total Throughput = 7456539
Number of Work Stations = 4


Appendix F
Technical Support

NetBench is available free of charge on 3 1/2" diskettes. If you would like to
receive a copy of NetBench or any other Ziff-Davis benchmark, send your
name, address, and phone number to the Ziff-Davis Benchmark Operation
(ZDBOp) benchmark request FAX number (919-380-2879) using the form in
this appendix. The benchmark will arrive via third-class US Mail.

If you want to receive this benchmark sooner and you have a Federal Express
account, include your account number and shipping instructions on the FAX
form, and the benchmark will arrive via Federal Express.

Contacting ZDBOp
If you have comments or questions about NetBench, please contact ZDBOp.
You can contact ZDBOp in several ways:
* If you have a modem and communications software, you can use
CompuServe* to access ZiffNet, the Ziff-Davis on-line service. The
ZiffNet forum is a way for you to communicate your questions and
comments to Ziff-Davis. (Access to CompuServe is available for a fee.)

If you are a member of CompuServe, simply type GO ZIFFNET at the
CompuServe prompt. If you are not a member of CompuServe, you can
access ZiffNet following the steps below:

1. Set your communications software to seven data bits, even parity,
one stop bit. Select a data transfer rate (bits per second) of 1200,
2400, or 9600. Using your modem, call the local ZiffNet number.
To find your local ZiffNet number, dial 800-346-3247 (by
modem) or 800-635-6225 (by voice). If you log in by modem, type
PHONES at the user ID prompt.

2. Respond to the prompts by entering the information below:
When you connect:
Host Name: CIS
User ID: 177000,5555
Password: ZIFF*NET
Agreement Number: MACMAN2
Register your name and credit card number for billing.

3. You will receive a Personal User ID and Temporary Password
on-line. Write them down and use them to log on. You will get a
permanent password by mail within 10 business days.

* You can FAX your questions and comments directly to NetBench
Technical Support at the ZDBOp benchmark FAX number (919-380-2879).

* You can mail your questions and comments to ZDBOp at the following
address:

Ziff-Davis Benchmark Operation
One Copley Parkway, Suite 510
Morrisville, North Carolina 27560
Attn.: NetBench Technical Support


Benchmark Request Form

FAX OR MAIL THIS FORM TO:
The Ziff-Davis Benchmark Operation

One Copley Parkway, Suite 510
Morrisville, NC 27560
FAX: (919) 380-2879
for assistance call (919) 380-2800

WHO I AM:
Name: ______________________________________________________
Company: ____________________________________________________
Address: _____________________________________________________
City: _________________________ State: ________ Zip: ______________
Phone Number: ________________________________________________
Fax: _________________________________________________________

We answer requests in the order that we receive them. We ship all benchmarks
via third-class US Mail unless you supply a Federal Express account number.
Please allow four to six weeks for delivery via regular mail.

Your Federal Express account number: ______________________________
Check one: ___ priority overnight ___standard overnight

WHAT I WANT:
___ PC Bench* 8.0
___ WinBench* 4.0
___ Winstone* 1.0 (available only on CD-ROM)
___ NetBench* 2.1
___ ServerBench* 1.0
___ MacBench* 1.0

(All benchmarks EXCEPT Winstone are available only on 3.5" diskettes.)


Glossary

Batch Run File
This is an optional text file which contains NetBench commands. It is
designed to run a series of NetBench tests unattended.

NIC
Network interface card.

Server
The computer whose physical drive holds the NetBench files. Test stations
reference this common logical drive in order to run NetBench.

Test station
A DOS computer running NetBench. Each computer waits to be triggered by the
Test Station Coordinator to begin the test. After the test is completed,
the results are collected by the Test Station Coordinator. Each test station
is identified with a unique user-assigned identifier called WS.ID. A PC
workstation must be placed in NetBench "waiting mode," and be physically
connected to the network to be considered a test station by NetBench.

Test Station Coordinator
One test station computer running NetBench (selected by the user) to
synchronize the test and collect test station results. The Test Station
Coordinator also acts as a test station. It performs the same tests as
the other test stations and then reactivates as the coordinating test
station at the completion of each NetBench test.

WS.ID
Each test station must have a local C: drive. Each test station must also
have a file in the root of the local C: drive called WS.ID. This file
must contain a unique three-digit test station identifier in position 1 of
the file (no leading spaces or line feeds before the identifier). Example: 001.
For administrative purposes, it is a good idea to number the test stations
sequentially across the network. In addition, this file may also contain a
seven-character alphanumeric Group identifier. If this is used, it must be
separated from the test station ID by at least one space. Example: 001
GroupA1. Group identifiers are not case sensitive, but only letters and
digits may be used. When a Batch Run File is used, Group identifiers are
referenced from the batch file.
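
Because the WS.ID rules are easy to get subtly wrong (a leading space or a
malformed group identifier, for example), a small helper can check a file
before a test run. The Python sketch below only illustrates the rules as
described above; it assumes a group identifier may be from one to seven
letters and digits, and it is not part of NetBench.

import re

def check_ws_id(text):
    # Validate the first line of a WS.ID file: a three-digit station ID in
    # position 1 (no leading spaces or blank lines), optionally followed by
    # at least one space and a group identifier of letters and digits.
    first = text.splitlines()[0] if text else ""
    match = re.match(r"(\d{3})(?:\s+([A-Za-z0-9]{1,7}))?\s*$", first)
    if not match:
        raise ValueError("WS.ID does not follow the documented format")
    return match.group(1), match.group(2)

# Example: check_ws_id("001 GroupA1") returns ("001", "GroupA1").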


Licensed Material (C) Ziff-Davis Publishing Company, L.P.


Copyright (C) 1992, 1993 by Ziff-Davis Publishing Company, L.P.
All rights reserved.
Manual release date: August 1993 with Version 2.10 of NetBench


