Ashton-Tate dBASE IV TechNotes for July 1991
1 READ ME
These articles are reprinted from the July 1991 edition of TechNotes/dBASE
IV. Due to the limitations of this medium, certain graphic elements such as
screen shots, illustrations and some tables have been omitted. Where
possible, reference to such items has been deleted. As a result,
continuity may be compromised.
TechNotes is a monthly publication from the Ashton-Tate Software Support
Center. For subscription information, call 800-545-9364.
2 ABCs of UDFs
ABCs of UDFs
Martin Leon
Almost everyone has heard about User Defined Functions. If you've read
TechNotes for any length of time, then you've probably seen a number of
UDFs provided in the UDF Library column. You may have heard that they
provide a great enhancement to the already very versatile dBASE language.
But do you need UDFs? Do you know how to make and implement your own
UDFs? This article will provide some basics as well as some tips on using
UDFs most effectively.
dBASE IV Functions
dBASE IV comes loaded with functions. Think of functions as messengers
that take your question and always bring back a response, a return value.
For example, you ask dBASE IV what today's date is by using the DATE()
function:
? DATE()
The value it returns is a date-type variable. This function does not
require any arguments to provide you with the information you are asking
for.
"Is this the right room for an argument?"
"Is not!", "Is too!", "This is just a contraction. This isn't an
argument.", "...can be.", "Is NOT!", "It certainly is..."
Although this may be the type of argument you are used to, when it comes to
a UDF, an argument takes on a more constructive role. Think of arguments
a UDF, an argument takes on a more constructive roll. Think of arguments
as additional pieces of information used to convey to a dBASE function
exactly what it is that you are asking for. Suppose you want to know what
day of the week a particular date falls on. You would use the CDOW()
function. This function needs to be told what date to use to be able to
tell you what day of the week it is. It expects a date-type argument and
returns a character string representing the day of the week ("Sunday"
through "Saturday") that date falls on. To find out what day of the week
today is, enter the following at the dot prompt:
? CDOW(DATE())
Some functions require more than one argument to be able to do their
defined tasks. For example, the MAX function will tell you which of two
arguments is greater.
? MAX(1, 100)
would return a value of 100. Obviously, if you ask dBASE IV to compare two
values, you must provide two arguments for it to compare.
dBASE IV has a very extensive set of commands and functions, but even if it
had twice as many, there would still be times when no function does
precisely what you want to do.
UDF Definition
We'll start with a simple example. Let's say you want a function that will
tell you how much sales tax will be added to a particular purchase.
Figuring out tax is such a simple operation that it doesn't really require
a UDF, but we'll use it to show the basic elements of a UDF. The sales tax
in this neck of the woods works out to 7%. Defining a function is easy.
If you want to use the dBASE editor, from the dot prompt, enter
MODIFY COMMAND TAX
The first executable line of this UDF would be:
FUNCTION Tax
This command tells dBASE IV that you are about to define a function named
Tax(). The next step is to decide if it needs any arguments. Arguments
are optional but since the amount of tax is dependent on the amount of the
sale, we must use the amount of sale as an argument to the Tax() function.
In other words, if we want our function to tell us how much tax is due, we
must tell it the amount of the sale on which to compute the tax. On the
receiving side, arguments are called parameters. For a function to be able
to receive the information you are passing to it in the form of arguments,
the function must be defined with corresponding parameters. The number of
arguments used when calling the function must match the number of
parameters defined in the function. Otherwise, dBASE IV will tell you that
you called the function with the wrong number of parameters. To define a
parameter, use the PARAMETER command immediately after the FUNCTION
declaration:
FUNCTION Tax
PARAMETER mSale_amt
This defines a single parameter called mSale_amt. This parameter is a
memory variable that will be initially set to the value we pass as an
argument. Because we want to perform a numeric calculation on the amount
of sale, the argument we use should be a numeric value, as opposed to a
character, date, or logical value. Although you can use other data types
as arguments or return values, this particular function requires a numeric
value. This memory variable/parameter is "local" to the function: no other
program or function can "see" it or modify it. Also, all parameters and
all other memory variables defined in a function are automatically released
when the function is done with its task. Of course, memory variables that
have previously been declared public are an exception.
Whether you use any parameters or not is up to you, but if you define
parameters, you must call the function with the corresponding number of
arguments. A parameter cannot be optional in a UDF like it sometimes can
in a regular dBASE function. If your UDF calls for it, you must pass
something. You can pass a null string ("") or 0 to indicate to your UDF a
certain condition.
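As an aside, here is a minimal sketch (not from the article) of a UDF that
treats an empty string argument as a signal to fall back on generic wording:
FUNCTION Greet
PARAMETER mName
IF LEN(TRIM(mName)) = 0     && "" was passed in: no name is known
   mName = "whoever you are"
ENDIF
RETURN "Hello, " + TRIM(mName) + "!"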
Finally, since this is such a simple calculation, the last step is to send
back the amount of sales tax.
FUNCTION Tax
PARAMETER mSale_amt
RETURN mSale_amt * .07
The RETURN command provides a special function in a UDF in that it can
relay a response from the calculation performed therein. Any expression
in the RETURN command is evaluated first and the result is what is sent
back as the answer. In a regular program or procedure, the RETURN command
does not take an expression and cannot return a value. In this case,
however, the RETURN command figures out the amount of tax due using the
formula we gave and sends the result back as its response to the question
"How much tax on this amount?"
Suppose we want to make this UDF more generic, more flexible. Let's make
it so that we need to provide both the amount of the sale and the tax
percentage as two arguments. The reason this is more flexible is that we
don't need to modify the UDF if we have to deal with a different tax rate;
you just change the value of the argument that provides the tax rate.
Also, let's suppose we don't want to figure out the decimal version of the
tax percentage; we want to just say 7 as the percentage. Then we need to
also tell the UDF to do the conversion for us:
FUNCTION Tax
PARAMETERS mSale_amt, mTaxperct
RETURN mSale_amt * (mTaxperct / 100)
Now when we call this function, the value of the first argument is stored
in mSale_amt, and the value of the second argument is stored in mTaxperct.
The RETURN command divides the value of mTaxperct by 100 and multiplies
that result by mSale_amt. The final result is what is sent back as the
reply.
? Tax(10, 7)
This call would return a value of 0.7.
In short, this is all there is to a UDF. You define it, define its
parameters (if any), do some processing and return a value. In this simple
UDF, the RETURN command does the processing and returns the resulting
value. It evaluates the mathematical expression and sends its result back
as the answer. Let's look at a UDF that would require more than just three
lines of code.
The DBFSize UDF will tell us the exact size of the currently active
database file. In order to do this we need to determine the size of the
database header and then add to that the result of multiplying the size of
each record by the number of records.
Database size = Size of database header + (# of records * size of each record)
There are dBASE IV functions that tell us the number of records and the
size of the records, but none that will tell us the size of the header.
The size of the header is figured by multiplying the number of fields by 32
and adding 34 to the result.
Database header size = (# of fields * 32) + 34
There isn't a dBASE function that will tell us how many fields there are in
a database, so it's going to take more than a simple expression to do all
this. Without getting into a lesson in programming, the following UDF
accomplishes this.
FUNCTION DBFSize
mFcount = 0
* This loop counts the fields in the database.
DO WHILE "" # FIELD(mFcount + 1)
mFcount = mFcount + 1
ENDDO
mHeadsiz = 32 * mFcount + 34
RETURN mHeadsiz + (RECSIZE() * RECCOUNT())
We don't use any parameters with this function because we want it to work
with whatever database is currently in use. We could, if we wanted, define
a parameter to receive a database name and to give us the size of that
particular database. Try to incorporate that change yourself for practice.
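For reference, one possible sketch of that exercise (not from the article;
it assumes the file name is passed as a character string and uses macro
substitution to open it, replacing whatever file is in use in the current
work area):
FUNCTION DBFSize2
PARAMETER mDbfname
USE &mDbfname              && open the named database file here
mFcount = 0
DO WHILE "" # FIELD(mFcount + 1)
   mFcount = mFcount + 1
ENDDO
mSize = (32 * mFcount + 34) + (RECSIZE() * RECCOUNT())
USE                        && close it again
RETURN mSize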
Notice that you can incorporate dBASE IV functions and programming structures into
your own functions. In fact, you can even incorporate other UDFs into your
UDFs.
For those of you familiar with dBASE IV procedures, there's basically only
one difference between a UDF and a procedure: a UDF must return a value,
even if that value is 0 or a null string. One major advantage of a UDF
over a procedure is that you can call a UDF any place that you could call a
dBASE function; there are no DO commands involved. It's just like extending
the dBASE language.
Accessibility of UDFs
Okay, so now you've created a UDF, but how does dBASE IV know that you've
created one? There are several ways. One option is that you can create a
library of functions and SET PROCEDURE to that library to have access to
all of the functions in that library. A library of functions would be a
single program file with several functions defined in it. By issuing the
command
SET PROCEDURE TO
you would have access to all of the functions defined in that file. This
file can be created using MODIFY COMMAND at the dot prompt or we could
create a dBASE Program (not an application) from the Application panel in
the Control Center. If you already have a number of UDFs scattered about
in your directory and would like to consolidate them under one file, open a
new file using MODIFY COMMAND and use the shortcut, Ctrl-K R to read in the
different files. Within this procedure file, all your UDFs claim their own
territory by their opening and closing statements which will always start
with the command FUNCTION and end with the command RETURN with a value.
Otherwise, you can type in your functions, one beneath the other as you are
inspired to create them.
For example, you could create a dBASE program called MyFunc that has all of
your UDFs in it:
*Myfunc.PRG
FUNCTION Tax
RETURN
FUNCTION DBFSize
RETURN
FUNCTION SomeOthr
RETURN
* and so on with all of your functions
* End of Myfunc.PRG
After saving it, you would go to the dot prompt or to some other program
where you need these functions and type:
SET PROCEDURE TO Myfunc
If everything compiles fine, you will now have access to any of the UDFs in
Myfunc. The problem with this method is that you must remember to use the
SET PROCEDURE command to be able to use your functions.
Another option is to put the UDF in a program file of the same name. For
our Tax() example, we would create a program (from the dot prompt or the
Control Center) called Tax. It would contain the definition for a single
function, the Tax() function. Whenever you make a call to the Tax()
function, dBASE IV will first look to see if it has such a command itself.
Verifying that it doesn't, dBASE IV will next check to see if the current
PROCEDURE file has this function defined in it somewhere. If that's not
the case, dBASE IV scans the current directory for the file TAX.DBO (the
compiled version of the .PRG; if it hasn't been compiled, dBASE IV will
compile TAX.PRG automatically). Upon finding it, dBASE IV loads it into
memory and runs it. Although this method frees us from having to remember
to SET PROCEDURE, it is technically slower because it will always involve
hard disk access time the first time you use the function and may require
additional disk access for successive calls to the function. This
difference in speed may be negligible. However, when developing a UDF,
this method may be easier since you will only have one UDF in the file and
won't have to go hunting for the UDF you are trying to perfect. You decide
which method you like better.
For programmers there is an additional option: put the function definition
in your main routine and it will be accessible to all sub-routines and
procedures and to other UDFs.
Once you implement any of these options, your UDF is available to dBASE IV
and could potentially be used anywhere that a dBASE IV function is used.
Limitations with UDFs
As mentioned earlier, you can use any of the existing dBASE IV commands,
functions, or programming structures in your UDFs. From that perspective,
there is no limit to what can be done in a UDF. What you can do in a UDF
is probably going to be most limited by the context that you try to use it
in.
As you design your UDF, you must remember where it is that you will be
calling this UDF from and how it is that you will be calling it. If you
look at the Tax() example, you'll notice that it basically just performs a
mathematical calculation. There is no need to move a record pointer in a
database, open another database, change work areas, modify the information
on the screen, open or close a window or any other complex operation. It's
a very safe UDF. You could use it in a format screen, a report form, as a
calculated field in a query, or from the dot prompt as part of a LIST
command. If you SET DBTRAP OFF, you could even use Tax() as part of an
index expression.
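For example, assuming a numeric field called Amount (the field and tag names
here are only illustrative):
SET DBTRAP OFF
INDEX ON Tax(Amount, 7) TAG taxdue
SET DBTRAP ON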
Warning: If you want to incorporate your UDF into these other parts of
dBASE IV, bear in mind that you are now sharing control in how dBASE IV
operates. And when you set DBTRAP OFF, you're obliged to be responsible in
your coding or otherwise run the risk of encountering potentially fatal
errors that can hang the program.
Next month, we'll expand the functionality of UDFs by using them in Reports
and Screen Forms.
3 Manual Overwrite
Manual Overwrite
When ? Is not Like @...SAY
Joe Stolz
If you've ever perused the Language Reference, you may have noticed the two
commands located at the very beginning: @.SAY and ?/??. Both of these
commands can display information to the screen, to a printer or to a file.
@.SAY seems most adept at printing information at a specific location on
the screen or on paper since the exact coordinates of the desired text or
field will be used for precise placement on the page. The ? command seems
a poor substitute for @.SAY. However, you may have also noticed that dBASE
IV has enhanced the abilities of ? beyond those of dBASE III PLUS and added
some unusual clauses; for example, AT and STYLE. Just what is going on
with ? and when should one use one over the other?
Let's Review Some Facts
In the days of dBASE II when you practically lived at the dot prompt, the ?
command was one of the best ways to communicate with the dBASE II program.
For example, to perform an equation you would type
? 1 + 1
and dBASE II would return 2. To verify even the most simplistic arithmetic
operation, you might type
? 5 = 5
to which the reassuring response is .T.. The ? command was loosely
translated as "What is" in the documentation.
Perhaps in those days, as today, you were also perplexed about the fact
that two commands SET PRINT ON and SET DEVICE TO PRINT (really, it was SET
FORMAT TO PRINT in dBASE II) existed. Well let's clear up these problems
once and for all and learn something in the process.
In a nutshell, SET DEVICE TO goes with @.SAY and SET PRINT goes with ?
commands. However, where SET DEVICE determines to which device @.SAY
commands will be directed, SET PRINT affects a wide range of commands,
directing them all to the printer if you SET PRINT ON. Commands that will
be printed are ones that normally display to the screen in a scrolling
fashion like LIST, DISPLAY MEMORY and any command that echoes to the screen
when SET ECHO is ON.
For formatting purposes, many programmers naturally leaned toward the use
of @.SAY commands because you could specify precise locations on the screen
or paper where you wanted output to appear. Originally, the ? command gave
no such luxury. In dBASE IV, much more octane was added to this
fundamental command. The addition of the AT clause for example, allowed
you to specify the column position where output would appear.
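For example (Lastname and Phone are only illustrative field names):
? "Customer:" AT 5, TRIM(Lastname) AT 20
?? " (" + Phone + ")"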
If you dare, view the contents of one of your report .FRG files or label
.LBG files. You will see (if you are programmatically inclined) that all
lines in the report that are sent to screen or printer do so through a
series of ? and ?? commands, often using the AT clause. If @.SAY is so
easy to use and predictable, why are reports using ? commands?
One reason is that a ? command better lends itself to the variable nature
of a dBASE report which could conceivably be the template for a number of
files. @.SAY commands can be directed to the screen in any way desired and
still display a pretty picture. However, since printers cannot back up to
the previous line (they print from top to bottom on a page, a line at a
time), an @.SAY directed to line 10 followed by one to line 4 will cause a
page eject and then a movement to line 4 on the following page. This can
be annoying to programmers but is entirely logical given the abilities of
printers.
Needless to say, @.SAY commands can still do a masterful job at printing a
well laid out page. Probably the overriding factor about reports that
lends them so well to ? type printing is that reports are usually very line
oriented. In a report, each detail record takes a line by itself. If
there are ten detail records, there will be ten lines. This is easily
handled by a series of ten ? commands, or by a DO loop that performs a ?
ten times. This isn't quite the same with @.SAY commands, though they too
can accommodate an unpredictable number of lines/records by referencing
line coordinates with a memory variable rather than a numeric constant.
Still, the variable would have the potential to cause a page eject if a
print row was a value less than the previously stated print row. Thus, it
is obvious that using @..SAY would require more maintenance.
Streaming Output
dBASE IV introduced a new type of printing terminology: streaming output.
Streaming output is what is sometimes referred to as linear - line by
line. Streaming output includes ?, but under no circumstances does
it include @.SAY output. This causes much chagrin but is an absolute fact
and must be accepted.
The ? commands are well suited to "streaming output" which is naturally
line after line, represented by ? after ? in your report program. It seems
a bit more adapted to unpredictable printing jobs.
Be that as it may, dBASE IV has clearly chosen ? to be the work horse of
printing. Let's look at a very fundamental aspect of ? printing and we will
see that ? is undoubtedly the best choice in printing jobs.
So what does streaming output do for you? New advanced printing functions
are available to print jobs that are streaming output oriented. These
include PRINTJOB...ENDPRINTJOB, ON PAGE AT LINE, and system memory variables
like _pcopies, _pscode, and _pecode. A PRINTJOB is a series of commands and
procedures that together make up a print routine. If the variable _pcopies
is set to 2 or more, the PRINTJOB will create a spool copy of the routine
and run a quick second copy (or more as needed). PRINTJOBs offer a means
to consolidate all operations that are relevant to printing, including some
that are hardware and environment dependent. Space doesn't permit an
in-depth discussion of print jobs but to re-emphasize the point, PRINTJOB
is for streaming output commands only and @.SAY is not a streaming output
command.
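A bare-bones sketch of the idea (it assumes an orders file is already in use;
Order_no and Amount are placeholder field names):
SET PRINT ON
_pcopies = 2                          && ask for two copies of the job
PRINTJOB
   ? "The Cheap Furniture Store - Orders" AT 10
   SCAN
      ? Order_no AT 0, Amount AT 20
   ENDSCAN
ENDPRINTJOB
SET PRINT OFF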
The other important aspect of ? that is unique to it is the STYLE clause
that can be applied. Again, this is a uniquely print-oriented feature.
STYLE offers certain standard type styles that can be applied to the text
that is being output by the ? command. These are Bold, Underline, Italics,
Raised and Lowered (if your printer supports these fonts). Further, each
printer can support five fonts that are unique to the type of printer you
use. STYLE also offers the only means to access these fonts via a command
in dBASE IV. Again, @.SAY doesn't offer this ability when printing either.
As if to totally tip the scale in favor of ? and streaming output,
PostScript must be printed through ? commands and not through @.SAY
commands. Due to the uniqueness of PostScript printing, ? commands
facilitate the process. If @.SAY commands are used to print with the
PostScript driver, gyrations must be performed to get line feeds and form
feeds. This is a needlessly difficult way to print when ? makes a PostScript
interface as painless as printing to any other printer type.
That means that ?/?? appears to be the winner when it comes to printing.
If any doubt remains that this is the premier way to print, check out the
Language Reference on the commands discussed and see what they can do for
you!
4 The Theory of Relativity
The Theory of Relativity
(as applied to the PROTECT command)
Loretta A. Steele
Once there was a little mom-and-pop operation that specialized in cheap
furniture and was appropriately named The Cheap Furniture Store.
Throughout the first twenty or so years, they fared marginally well. In
the eighties, it seemed that they would have to go belly up. No one was
buying anything cheap. And then the nineties hit and suddenly, cheap
furniture was the in thing. Everybody's belt was so tight, they had to add
extra notches and carry oxygen tanks. It was then that The Cheap Furniture
Store's business flourished. They expanded their staff, computerized their
operation and changed their name to something that sounds Scandinavian.
As fate would have it, their computer operations were managed by dBASE IV.
As fate would also have it, they hired an employee who was a little too
smart for his own good. Before they knew it, they had their own
embezzlement scandal. They lost some money and presumably, some faith in
their fellow man. But times had changed drastically since that memorable
day when they hung their first GRAND OPENING sign. Computers had made
thievery an exquisite art and no one was safe. It became apparent that a
security system was necessary to protect the assets and confidential
information that used to be in a locked file cabinet. The computer age had
reduced all of this to a very vulnerable state. So they called the same
brilliant consultant that set up their system initially, to ask him what
software or hardware they must now purchase to safeguard their computerized
investment. He told them that the capability was already there and that
implementing it would be no problem. (Of course, he was a consultant.)
It is from here that we will explore the considerations necessary for ensuring
the security of their accounting database files. Because of the job
requirements of the different store employees, the database files that make
up the store accounting system must have some level of restricted access.
The questions then concern how the database files are to be protected from
unauthorized access by store employees and how the fields within the
database files are to be secured.
The job requirements of the furniture store employees in relation to the
accounting database files will be the focus of establishing the protected
accounting system. The makeup of the staff consists of the store manager,
payroll clerks, accounts receivable clerks, order entry clerks and an
outside accountant. The database files to be protected are: EMPLOYEE.DBF,
ORDERS.DBF, ORDERS.DBT and ACCT_REC.DBF. These files are found on the
Samples Disk that is included with the dBASE IV package. As you go through
this tutorial, you can follow along on your system using these files.
The PROTECT utility includes user log-in security, database file and
field access level security, and data encryption of database, memo, and
index files. You can use PROTECT just for the log-in security without
involving access levels and data encryption. If you choose to use the
access levels and data encryption, PROTECT will always control the access
to the encrypted files.
Privileged Information
The log-in security is a password-protected feature. A user must enter
their correct group name, user name and password before they can enter
dBASE IV. The three items are collectively called the User Profile, which
is stored in the Dbsystem.DB file created in the dBASE directory. If a
user fails to correctly enter their user profile after three tries, the
PROTECT system will return them to the operating system prompt. The log-in
security is enforced regardless of what directory or drive dBASE IV is
invoked from. The Dbsystem.DB file is an encrypted file
so no one can discover its stored user profiles. If erased or damaged, it
can effectively lock you out of ever accessing your data unless you have
unencrypted backups or a reliable backup of the Dbsystem.DB file itself.
The Access Level security controls the usage of the database files and the
fields within those files. Each user is assigned an access level, which
gives them specific privileges or establishes what the user can do with the
file and its fields. These file and field privileges are called the user's
privilege scheme. The first part of the privilege scheme is File access
privileges. These privileges consist of Read, Update, Delete and Extend.
The Read privilege gives the user the right to view records within a
database file. The Update privilege allows the user to edit or change the
contents of records. The Extend privilege gives the user the right to
append or add records to the database file. The Delete privilege grants
the user the right to delete records from a database.
The second part of the privilege scheme is Field Access Privileges. These
privileges are: Full, Read Only (or R/O) and None.
Full privilege authorization means a user can read and write to a field.
If a user has the Read Only or R/O privilege, they can read a field but
can't edit or change the contents of a field. The None privilege means the
user will not be able to view that field and, since it isn't visible, they
cannot edit the field.
The final feature of the Protect system is data encryption. The database,
memo and index files can be encoded to prevent the reading of their
contents. Only by logging into a PROTECTed dBASE IV system can users
access and read these files, and only through the features of a PROTECTed
system can these files be decrypted and made readable in a non-protected
dBASE IV system. The encryption process is automatic once
access level security information is saved.
The Store Manager must have access to all files and fields within all
databases. The Manager will be assigned a user access level of 1 on each
database file. An example of the Manager's user profile for the
EMPLOYEE.DBF file follows:
Group name: STAFF
User name: MANAGER
File name: EMPLOYEE
Password: ALLRIGHT
Access level: 1
Clerical Responsibilities
The Payroll Clerks only deal with the EMPLOYEE.DBF database file. The
majority of the fields in this file are edited by the Payroll Clerks. But
there are a few fields that the Clerks cannot change. With this in mind,
a user access level of 4 is assigned to Payroll Clerks. Because of their
restricted access, they obviously cannot be given the same access level as
the Manager. On the other hand, an access level of 8 would be too
restrictive. The number 4 is midway between the least restrictive access
level and the most restrictive level. There is nothing magical about the
number 4. Assigning access levels is based on a sliding scale. An access
level number is only significant in relation to the other access levels
you've set. What we really need to focus on is the Payroll Clerks' access
in relation to the Store Manager. It's obvious that the Clerks must not
have the same access as the Manager. In fact, it was real obvious when
that whole nasty embezzlement scandal happened.
We also know that the Clerks cannot be completely denied access to the
EMPLOYEE file. This leaves us with a range of access level numbers from 2
to 7. Any number in this range would work if we are only considering a
Store Manager and Payroll Clerks. But our system will also involve an
Accountant who needs access to the EMPLOYEE file. Since we have a range to
play with, one approach during the initial assignment of access levels is
to choose a mid-range number where possible. This method gives you some
flexibility during your assignments. Don't box yourself in by trying to
assign levels in strict sequential order. Spread the numbers out, give
yourself some breathing room. Always focus on each user's needs in relation
to the other users.
The Order Entry Clerks must use the ORDERS.DBF database. They are assigned
a user profile access level of 4. These Clerks will have limited access to
the ORDERS file. However, they must not be completely restricted. An
assignment of 4 gives us leeway, just as in the case of the Payroll
Clerks. We still have an Accountant who needs access to the ORDERS file.
So in relation to the Store Manager's access level of 1 and a very
restrictive level for the Accountant, the Order Entry Clerks can fit midway
on our sliding scale of access levels.
Now for the Accounts Receivable Clerks (A/R Clerks). They must have access
to the ACCT_REC.DBF and ORDERS.DBF files. In dealing with the ACCT_REC.DBF
file, the A/R Clerks will enter and maintain all fields. Their access
level will be a 5. You might be wondering why their level is only 5.
After all, they must have access to all of the fields. Wouldn't it be more
logical to give the least restrictive level of 1? In an isolated
circumstance, that reasoning would be correct. The A/R Clerks deal with
two database files. We need to make an assignment that will encompass both
files.
In the ORDERS file, the A/R Clerks only need to read the information but
never alter it. This is a very restrictive access. But it is not the most
extreme access. The most extreme access would be no access at all. We
wanted to make one assignment for the A/R Clerks that would reflect their
needs in both the ACCT_REC and ORDERS files. If we had given them an
access level of 1 in ACCT_REC because they needed to maintain all the
fields in that file, that would not have reflected the read-only restriction
placed on the ORDERS file. We could go back to our comfortable
midway assignment of 4. That would reflect the needs of the Clerks in both
files but we should give the A/R Clerks a more restrictive assignment. The
Payroll and Order Entry Clerks all have an assignment of 4 but they are not
as restricted in their respective files as are the A/R Clerks with the
ORDERS file. Therefore, the access level of 5 was based on that
reasoning. A/R Clerks need access to all fields in ACCT_REC but they need
more restrictive use of the ORDERS file. Their access level in both files
will be a 5.
The Outside Accountant
The final user is the outside Accountant. The Accountant must have access
to all the files (EMPLOYEE.DBF, ORDERS.DBF, and ACCT REC.DBF). However,
the access of the fields within those files will have some strong
limitations. The Accountant's access must be more limited than any of the
staff. At this point the most restrictive assignment we may have is an
access level of 5. Therefore, in relation to the Accountant, we have a
range of 6 to 8. The Accountant will have an access level of 7. The
numbers 6, 7, or 8 would have served our purpose but our philosophy is to
spread the numbers out when possible. If 6 will work, so will 7. Access
level assignment is a sliding scale, not a strict fixed scale. Always
consider the numbers in relation to the other assignments. Spread the
numbers out, be flexible.
Going by the Book
We're approaching the final stages of creating a PROTECTed database system,
having completed security request forms (a form for this purpose is found
in the dBASE IV manual, page 14-41 of Using the Menu System in the version
1.1 package. In version 1.0, it is found in Networking with dBASE IV, but
really, why haven't you upgraded?) for all users and groups. We've
specified the appropriate user profiles and privilege schemes. What's
left? Only the task of entering this information into the dBASE IV PROTECT
Menu remains. This will create a Dbsystem.DB encrypted file to contain
user profile data and then encrypt our database and memo files.
We will start by backing up any original database and memo files for
safekeeping. Maintaining regular backups is very important but especially
so when working with encrypted files. Should some sort of corruption
render your file in a perpetually locked state, oblivious to passwords, a
reliable backup will be your only option. You must SET ENCRYPTION OFF
before COPYing a file to a backup. Copying an encrypted database through
DOS will not be a reliable backup should your password system become
corrupted.
The next step will be to create a new subdirectory called PROTECT, beneath
the DBASE subdirectory, copying all files to be protected into that
directory. Step three is to get a printout of all database file structures
that will be used in PROTECT for documentation purposes. Each file will
exist in three separate places: One in the original DBASE subdirectory,
the second on floppy disks, cartridge or tape, and the third set in a
separate PROTECT subdirectory. The files in the PROTECT directory are the
ones we will use in establishing a protected dBASE IV accounting system.
Using the printouts of the database structures and security request forms,
we're ready to enter the PROTECT menus and record user profiles and
privilege schemes. When you start PROTECT, you must enter an Administrator
password. The password should be one that's easy to remember since it is
your only access to a PROTECTed system, once it has been established. If
you forget the password, there is no way of retrieving it later, short of
eliminating the DbSystem.DB file and all encrypted files and starting
over. So store this password in a safe place.
Entering PROTECT Mode
The Protect Menu has four options: Users, Files, Reports and Exit. The
Users option is where we create user profiles which are maintained in the
Dbsystem.DB file. To create a user profile, we must give each user a login
name, password, group name and access level. A full name is optional; it
is used only to describe in more detail the identity of the
user. The most important assignment in a user profile is the access
level. This level is used in the File menu to establish the type of access
the user will have to a database file and its fields. After completing a
user's profile, we move the cursor down to the 'Store user profile' option and
press the Enter key before proceeding with the next user.
User Profiles
We now move to the Files option after entering all the user profiles. Here
we will set up group access by file and field privileges. We will closely
follow the data recorded on the Security Request Forms. The privilege
schemes we create will be saved in the database file structure. Each
database is associated with only one group but one group may have many
associated database files. The two most important sections of this menu
are: File access privileges and Field access privileges. The File access
privileges are those values which are the most restrictive for all users in
the system. Looking over the completed Security Request Forms, enter the
access levels for Read, Update, Extend and Delete.
The 'Field access privileges' relate to the individual access levels from
the Users menu. This is where each field in the database is assigned:
Full, None or R/O (read only) status. After completing the first access
level privileges for a database file, instead of proceeding to the Store
file privileges option, we back up to Access level and proceed to enter the
next level for this database. Continue until all the access levels for a
particular database file have been entered. At that point, you can then
select the Store file privileges option.
Get It in Writing
The next menu option, Reports, is very important. Since the Dbsystem.DB
contains the security status of each user and is an encrypted file, you
will want to have a hard copy of this security information. The User
Information Report lists the user name, password, group, full name and
access level for each user. The File Information Report gives you the
group name, file access privileges and field privileges. Print this
immediately after the database file is encrypted. Encryption is done
through the Save feature in the Exit menu option. It is important to print
this information entirely; primarily, to verify the accuracy of your
protected system and, secondly, as a hard copy record of your protect
system for documentation and maintenance purposes. Lastly, it may sound
like a broken record but don't forget to save your password in a secure
place.
The Exit option has three features: Save (new and updated user profiles and
privilege schemes are saved - .DBF files are converted to .CRP files and
.DBT files become .CRT), Abandon (cancel all new and updated user profiles
and privilege schemes) and Exit (new and updated user profiles and privilege
schemes are encrypted on the database files if not already done through the
Save option).
We now exit the PROTECT menu and return to the Control Center or dot
prompt. Before we can test our protected system, we need to rename the
newly encrypted .CRP files to .DBF files. Likewise for encrypted memo
files (.CRT extension) to .DBT memo files. It is also imperative to
re-index your databases.
Going for a Test Drive
Logging in as the Store Manager, we can test for the accuracy of the group
name, user name and password. In dBASE IV, open the encrypted EMPLOYEE.DBF
file and go into the BROWSE mode. In this mode, we can test for reading,
extending, updating and deleting of records within the database. If the
Store Manager's privilege scheme has successfully passed, we go to the dot
prompt and type LOGOUT. This will allow us to login the next user profile
for testing.
Continue testing the rest of the users for their user profiles and
privilege schemes against the Security Request Forms by repeating the above
procedure. Upon completion of the testing, you can erase the old .DBF and
.DBT files from the system and leave the encrypted files for use in dBASE
IV.
Protect Tips to Keep in Mind
Two types of files are encrypted during the PROTECT session:
database and memo files.
Index files are only encrypted when you REINDEX or create them with
an encrypted database.
Record your Administrator Password! There is no way of recovering
this information if forgotten.
Use the Security Request Forms or a form of your own design to
record user profiles and privilege schemes. A Protected system must be
maintained and regularly updated. These Forms will serve as your
blueprint to the system. Also keep copies of the reports printed from
the Reports Menu in PROTECT.
If you erase the Dbsystem.DB file after your database files have
been encrypted, you won't see the login screen upon entering dBASE IV.
However, when you try to open one of the encrypted database files, you will
get the error message "Database encrypted" and you won't have access to
that file. Secure your Dbsystem.DB files. Make a backup copy.
Backup your encrypted files frequently. If your files are
important enough to be encrypted, then back them up often. The same is true
of your unencrypted files. Having current backups will save you time and can
eliminate potentially many hours of recreating your data.
Protect Commands and Functions
LOGOUT
Logs out the current user and presents a new login screen to allow another
user to login. LOGOUT closes all open database files, associated files and
program files.
SET ENCRYPTION ON/OFF
Default is OFF. In a protected system SET ENCRYPTION is ON. With SET
ENCRYPTION OFF an encrypted file can be copied to a decrypted form. This
is usually done to export a file to another application, to make a backup
copy of the file or to use it in a non-protected dBASE IV system.
ACCESS()
Returns the access level in a numeric form of the current user. If a user
has an access level of zero, they cannot access any encrypted files. In a
non-protected system, ACCESS() always returns a zero.
USER()
Returns a string containing the user name of the current logged-in user.
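In a program, these two functions can be used together to branch on who is
logged in; a small sketch (the cutoff level of 4 is only an example):
IF ACCESS() > 4
   ? "Sorry, " + USER() + ", you are not authorized for this option."
   RETURN
ENDIF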
The End of Our Saga
Now all is well with the store. In fact, they're getting ready to become a
chain. Great big warehouse type stores with play areas for the kids and a
cafeteria that serves microwaved entrees like Swedish Meatballs. Customers
will come from all over to get great deals on cheap furniture. And as they
rake in the major bucks, they are secure in the knowledge that all their
hard work is under the safe keeping of the dBASE PROTECT utility. The
End.
5 Q&A
Q&A
"Summed Summaries" Exception
Order of Precedence with calculated fields throws a curve ball for some.
I'm so confused. I tried to use the method described in "Summed Summaries"
(TechNotes/dBASE IV, January 1991) but got inaccurate results. I created a
named summary field in the Group Summary band with the "SUM" operator. I
didn't declare it as a "hidden" field because I wanted to see the summary
result. I then created a hidden named calculated field in the same band,
using the name of the summary field as the expression for the calculated
field. Then I put an "Average" type summary field in the Report Summary
Band, summarizing on the hidden calculated field that I had just created in
the Group Summary Band. Doing a spot check of one group, my averages were
lower than expected. What happened?
You got caught in the "order of precedence" squeeze. Since the Group
Summary Band has a named hidden calculated field and a named visible summary
field, the hidden field value is calculated first since both fields are
named. This information is summarized on page 10-20 of Using the Menu
System.
To solve this problem, reverse the hidden status of the two fields in the
Group Summary Band. Make the summary field hidden and "unhide" the
calculated field. This will force the report generator to calculate the
summary field first, followed by the calculated field. The calculated
field will display correctly, so the average in the Report Summary Band
will now be accurate.
Dynamic Colors
How do I allocate colors dynamically in a program? I want to be able to
use memory variables to supply colors for various portions of my
programs.
You can use macro substitution in most cases:
@ 5,5 SAY "Hello world!" COLOR &ClrInfo
However, you cannot use this technique in a file that you SET FORMAT TO.
Macro substitution is not allowed in the region of a .FMT file that has
your @..SAYs and @..GETs.
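For example, the memory variable simply needs to hold a valid color string
before the line above executes:
ClrInfo = "W+/B"                     && bright white on blue
@ 5,5 SAY "Hello world!" COLOR &ClrInfo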
Double Your Data, Double Your Fun
I have a database where often times the data in one set of fields will be
the same as the data in another set of fields. For example, one set of
fields is for the mailing address of a customer and has the name of the
person who placed an order, and the other set of fields is for billing
information and has the name of the purchasing agent. A lot of times much
of this information is the same. Is there a simple way to have the
information in the first set of fields automatically copied to the second
set during data entry?
If you set up your format screen to be a two page format screen, there is
an extremely simple way. You place the first set of fields on the first
page, and the second set of fields on the second page. For the second set
of fields you go to the Edit options: Default value and enter the names of
the fields from the first set of fields that you want to copy. The catch
is that the fields must be on separate pages for you to do this.
A Unique Problem
If I create an index with unique keys using the Control Center or the dot
prompt, I don't get unique records when I choose the Include Indexes option
in my queries and choose to sort on the field or index expression that is
supposed to have a unique key or conditional index. Finally, sometimes I
don't see some of the tags that I know are part of my database in the Query
generator.
The problem with using an index with unique keys is related to how the
index is created. In order for it to work properly, you must create the
index from the dot prompt with the following commands:
SET UNIQUE ON
INDEX ON TAG
SET UNIQUE OFF
This is the only way to create an index tag that the Query generator will
use as a unique index. In addition, you should create an index expression
such that it is not just the name of your field. In other words, instead
of a field named Lastname being your entire index expression, you should
use Lastname + "" as the index expression. This applies to the situations
where you don't see indexes you know you've created or where certain
conditional indexes aren't filtering out records.
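Putting both suggestions together for the Lastname example (the tag name is
arbitrary):
SET UNIQUE ON
INDEX ON Lastname + "" TAG u_last
SET UNIQUE OFF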
Commands that Take a Back Seat
When I use the command
BROWSE FORMAT FIELDS Lastname, Firstname, Phone
I get more than just these three fields I specified. Why is dBASE IV
ignoring the FIELDS part of the BROWSE command?
The FORMAT clause of the BROWSE command is a great way to add data
validation and formatting to your applications that use BROWSE. When you
use this option, it takes priority over the FIELDS option. If your format
screen has ten fields defined in it, for example, whenever you have that
format screen active and you issue the BROWSE FORMAT command, you will see
all ten of the fields defined in your format screen in your BROWSE screen,
regardless of whether you use the FIELDS clause. If you only want to see
three fields in BROWSE using picture functions and validation, you need to
create a format screen with only those three fields in it, then you can SET
FORMAT to that screen and use the BROWSE FORMAT. This is as designed and
is mentioned in Language Reference.
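In practice, that means something along these lines, where PHONES.FMT is a
hypothetical format file containing only the three fields:
SET FORMAT TO Phones
BROWSE FORMAT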
The Build Bone's Connected to the Print Bone...
When I run the BUILD utility, I can only access the Exit menu. Do I need
to re-install?
The problem is usually that your CONFIG.DB file has a line in it that is
more than 95 characters in length. This is commonly a printer definition
line. Trim it down by removing extra spaces or shortening the name of the
printer. Try BUILD again and it should work.
One Man's Bug is Another Man's Feature
When I have my TYPEAHEAD buffer set to 0, dBASE IV ignores my ON KEY LABEL
definitions. Is this a bug?
This "problem" has to do with how ON KEY LABEL is implemented. It has to
push something onto the keyboard stack and is unable to do this if you SET
TYPEAHEAD TO 0. So think of it as a way to temporarily disable your ON KEY
LABEL definitions without having to "un-define" them and later redefine them. So,
from one perspective, it could be considered anomalous. But there's
probably someone out there thinking, "I was wondering how I could
temporarily disable my ON KEY LABEL commands."
6 Using Lotus Files
Working with Lotus Files
Kevin Ng
When considering a Lotus spreadsheet, as with most software that has been
around for a while, one must consider the evolution of its formats. The
nuances of each version usually show up by way of a modified extension from
the previous version. For instance, Lotus Release 1a creates files with
.WKS extensions, Release 2.x creates .WK1 files and Release 3 creates .WK3 files.
Thankfully, Release 2.x and Release 3 both include a translating utility,
TRANS.EXE, for translating a file to several different formats for downward
compatibility.
Direct File Transfer
There are two commands in dBASE IV or dBASE III PLUS to use with Lotus
files. They are the APPEND and IMPORT commands. What are the
differences? When should one use APPEND or IMPORT?
In dBASE IV, you can find IMPORT FROM in the Tools Menu of the Control
Center. APPEND FROM is in the APPEND Menu of a MODIFY STRUCTURE session.
Both APPEND FROM and IMPORT FROM are commands accessible from the dot
prompt or within a dBASE program.
IMPORT creates a database structure when invoked and can be used where no
dBASE database structure exists. This menu option/command is particularly
useful when you have in your possession only the .WK1 file itself and no
Lotus software by which you could access the contents.
APPEND FROM works with the original Lotus version .WKS files and is used
primarily when one has an existing dBASE database structure and the data
from the Lotus file is to be appended to it.
There are times when there is no alternative but to use the APPEND FROM
option, such as when the database you want to add data to already exists
and its structure must be conformed to. Under those circumstances, the
Lotus TRANS.EXE can be used to translate a Lotus type .WK1 back to a .WKS
file.
For the latest Lotus files, those with .WK3 extensions, it is advisable to
use the Lotus TRANS.EXE utility to translate the Lotus file directly into a
dBASE III file format. Note that dBASE IV can automatically read a dBASE
III file.
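As dot prompt examples (the file names are only placeholders):
IMPORT FROM Budget TYPE WK1        && builds BUDGET.DBF from BUDGET.WK1
USE Sales
APPEND FROM Oldsales TYPE WKS      && adds the spreadsheet rows to SALES.DBF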
Indirect File Transfer
In the case where indirect file transfer is preferred, the APPEND FROM..SDF
command can be used for adding data to the database and the COPY TO..SDF
command can be used for exporting a database file to an ASCII text file.
SDF stands for System Data Format and is often referred to as Fixed Length
Format. SDF is recommended because Lotus uses only SDF for indirect
transfer.
A database structure with a matching number of fields of the appropriate
length must exist before the APPEND FROM..SDF can be used.
In dBASE IV, you can find APPEND FROM in the APPEND Menu of a Modify
Structure session. For COPY TO, use the EXPORT menu of the TOOLS menu in
the Control Center. Both APPEND FROM and COPY TO are commands accessible
from the dot prompt or within a dBASE program.
A good way to experiment with appending from or copying to an ASCII text
file is to use a test database to try the APPEND FROM and the COPY TO
commands. The dBASE text editor can be used to view the text file by using
the command
MODIFY FILE
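A quick round trip with a hypothetical TEST.DBF might look like this:
USE Test
COPY TO Test TYPE SDF              && creates TEST.TXT, a fixed-length text file
APPEND FROM Test TYPE SDF          && reads the same rows back in
MODIFY FILE Test.txt               && view the text file in the dBASE editor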
Using the IMPORT..WK1 Command
A common problem of IMPORTing FROM a Lotus .WK1 file is the "Two Decimal
Places Only Syndrome". Columns of numbers come in as fields with decimal
places set at 2 only. For example, 12.3456 comes in as 12.35 (note that it
is rounded up).
A Lotus spreadsheet allows "free" format. Formatting the spreadsheet
globally or by a specified range with fixed decimal places is a Lotus
option. To create a database structure through an IMPORT operation, dBASE
IV needs to know the information on decimal places up front. It uses 2 as
a default unless it can determine from the spreadsheet how it was FIXED in
Lotus.
With the decimal places FIXED at 4, 12.3456 will import precisely. FIXED
at 7 and 12.3456 will show up in the spreadsheet as 12.3456000 and will
come into dBASE IV as 12.3456000 nicely. So specifying a FIXED number of
decimal places in Lotus is advisable before an IMPORT operation in dBASE
IV.
Generic Field Names
A database file created by the dBASE IMPORT command will have generic field
names such as A, B, ... Z, AA, AB, and so on. The first row of a spreadsheet
usually contains the titles of each column. This first row is read in as a
record instead of being used as the names of the fields in the newly
created database structure.
Should this title row in the spreadsheet be all characters (an extremely
likely scenario since most titles incorporate letters), there's also the
negative effect of all the field types in the newly created database
structure being set to CHARACTER. Therefore, it is advisable to remove the
titles for the IMPORT.
To obtain customized field names when the database file is created, the only
alternative is to use the Lotus translate utility (TRANS.EXE) to translate
the .WK1 file to a dBASE III file format.
Missing Blank Row
Refer to Missing Blank Row under APPEND FROM..WKS below.
Blank rows will be retained as blank records if the Lotus translate utility
file (TRANS.EXE) is used to translate a .WK1 file to a dBASE III file.
Using APPEND FROM..WKS
Missing First Row
The first row in the spreadsheet is always dropped during the APPEND FROM
operation. In this case, make sure that the first row is only an
unnecessary header or title. In case the first row contains data, use the
INSERT option in Lotus to insert a blank row as the top row of the
spreadsheet.
Missing Blank Row
A blank row in a spreadsheet will be disregarded by dBASE III PLUS and
dBASE IV and will not be brought in. Should it be necessary to bring in a
blank row as a record, then the blank row must not be blank. Enter a blank
space or a 0 into the last cell of the row and it will be brought in.
Missing First Field
The last cell of a row in the spreadsheet must contain data (even a
physical space). If it is empty, the first field in the next record will
be blank.
In the case where there may be many instances where the last cells in the
last column of the spreadsheet are empty, insert a dummy column as the last
column and copy some dummy data into it. This will also solve the missing
blank row problem.
For example, the database has six fields and in the spreadsheet there are
six columns. In the spreadsheet, insert a seventh column and copy some
dummy data into it.
So, when APPENDing a Lotus type .WKS file into a database, it is most
efficient and effective to use the following steps.
Insert a blank row as the top row of the spreadsheet if there is
no title or header in the first row.
Insert an extra column as the last column and fill this dummy
column with some dummy data.
Don't forget to save the changes.
In Conclusion
As the software programs from different software publishers constantly
change, it becomes somewhat of a challenge for each company's program to
accommodate some other popular format. Proprietary rights, trade secrets
on new file formats, and technologies make things even more difficult. But
most companies make the effort. Sometimes the translation from one format
to another requires a little extra TLC as has been shown here. But, until
that day when everybody is going at the same pace as everybody else in the
software business, having the right information on how import and export
operations work with the least amount of effort will suffice.
7 Self Documenting Forms
Self Documenting Forms
Tom Woodward
Confronting the natural aversion to completing
a process by thoroughly documenting it.
The process of creating a free-standing application in dBASE IV can take a
programmer through a guided tour of different moods and emotions. The
design and implementation of a full-blown database application can be
interesting, and even, depending on your own personal interpretation, fun.
Then comes the sometimes frustrating, yet occasionally rewarding process of
debugging. You then take your finished product to your boss who gives you
the obligatory pat on the back, and then utters the words which drain the
thrill of accomplishment from your blood, and bring dread to your heart:
"Now document this thoroughly so that someone can take over if something
unforeseeable happens to you in the near future."
Documentation: the lowliest task in the development process, a necessary
evil that every programmer needs to tackle before moving on to the next
project. Wouldn't it be great if it could be done for you? If you used
the application generator to create the skeleton of your application, you
can use DOCUMENT.GEN to generate documentation automatically, but what
about all those involved screen formats, and those tricky reports? Well,
by modifying FORM.GEN and REPORT.GEN, we can have self-documenting form and
report generators to go along with our existing self-documenting menu
generator.
In the interest of conserving space and time, we are only going to cover
how to create a self-documenting screen form, along with a solemn promise
from the author to tackle the report documentor in a not-too-distant
future issue.
The increased complexity of the dBASE IV format files brings up the
necessity for in depth documentation. Validation and conditional edit
clauses, default value and calculated field expressions, edit prompt and
unaccepted value messages, as well as the screen number (for multi-screen
formats) and screen position of the field all need to be documented for
easy review.
To create the form documentor we simply need to declare the extra variable
names that our routine uses. Just insert these at the top of FORM.COD
after the existing "var" statement (it ends with the last variable
declaration, terminated by a ";") as follows:
//
// Documenting procedure variables
//
var
parse_str,   // String to parse during line formatting
temp_str,    // Temporary string used for parsing
test1,       // Temporary variable used in line formatting
remain_str,  // Text left over from formatting procedure
last_at,     // Position where the last line formatted left off
cntr1,       // Counter variable
breakpt1,    // Point in string where string is broken for eoln
breakpt2,    // Point in string where string is broken for eoln
first_time,  // First time through loop flag
first,       // First time through loop flag
first_test;  // First time through loop flag
After that's done we just insert the lines of code at the top of the body
of FORM.COD which will call the documenting procedures:
//
// Call documenting procedures
//
if !make_fmt_doc() then goto nogen
doc_header()    // Print Header in the Doc file
fmt_doc_body()  // Create Documentation
Note: Because the form documenting version of FORM.GEN is somewhat slower
in generating, you may want to make a copy of the normal FORM.GEN before
overwriting it with the documenting version. The non-documenting
generator could be used during development; then, when all the bugs and
design issues have been resolved, you can use the documenting generator.
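For instance, one way to keep both versions on hand is to park a copy of
the original generator under another name before overwriting it (the name
FORMFAST.GEN is only an example). From the dot prompt:
COPY FILE FORM.GEN TO FORMFAST.GEN
During development you can copy FORMFAST.GEN back over FORM.GEN the same
way, then switch to the documenting version for the final save.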
Then, at the bottom of FORM.COD we need to add the documenting procedure
which is as follows:
//---------------------------------------------
// Documenting procedure definitions
//---------------------------------------------
//----------------------------------------------
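// Walk through every field element on the form and write a block of
// documentation for it: field name and type, picture clause and picture
// functions, ranges, conditions, messages, attributes, and screen
// position, with ruler lines separating the entries.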
define fmt_doc_body()
testing = line cnt = wnd cnt = 0
foreach ELEMENT k
//
if ELEMENT TYPE == @FLD ELEMENT then
new page(k)
if FLD FIELDTYPE == memvar then}
Variable Name:{space(indent no - 14)}{cap first(FLD FIELDNAME)}
{ else}
Field Name:{space(indent no - 11)}{cap first(FLD FIELDNAME)} \
{ if FLD FIELDTYPE == calc then}
(Calculated)
{ else}
{ endif}
{ endif}
Field Type:{space(indent no - 11)}\
{ case FLD VALUE TYPE of
case type char:}
Character
{ case type date:}
Date
{ case type float:}
Floating Point
{ case type bool:}
Logical
{ case type memo:}
Memo
{ otherwise:}
Numeric
{ endcase
if Ok Template(k) then
remain str = ""}
Picture Clause:{space(indent no - 15)}\
{wrap string(FLD TEMPLATE,width of wrap,0,remain str);}
{remain str}
{ else}
Picture Clause:{space(indent no - 15)}None
{ endif}
{ print( replicate( "-",78) + crlf);}
{ if FLD PICFUN then}
Picture Functions:{space(indent no - 18)}{get functions(k);}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD FIELDTYPE == calc then
if FLD DESCRIPT then
remain str = ""}
Description:{space(indent no - 12)}\
{wrap string(FLD DESCRIPT,width of wrap,0,"");}
{remain str}
{ endif}
Expression:{space(indent no - 11)}\
{ last at = 0
remain str = ""
foreach FLD EXPRESSION cntr1 in k}
{wrap string(FLD EXPRESSION,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ endif
if chr(FLD VALUE TYPE) == "M" && FLD MEM TYP then
if wnd cnt < 20 then ++wnd cnt endif}
Window Name:{space(indent no - 12)}Wndow{wnd cnt}
Window Coordinates:{space(indent no - 19)}{Doc Box(k)}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD L BOUND or FLD U BOUND then}
Valid Range:{space(indent no - 12)}\
{ if FLD L BOUND then}
From: \
{ last at = 6
remain str = ""
foreach FLD L BOUND cntr1 in k}
{wrap string(FLD L BOUND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ endif
if FLD U BOUND then
if !FLD L BOUND then}
To: \
{ else}
{space(indent no)}To: \
{ endif
last at = 6
remain str = ""
foreach FLD U BOUND cntr1 in k}
{wrap string(FLD U BOUND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD OK COND then}
Accept Value When:{space(indent no - 18)}\
{ last at = 0
remain str = ""
foreach FLD OK COND cntr1 in k}
{wrap string(FLD OK COND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ if FLD REJ MSG then
remain str = ""}
Unaccepted Message:{space(indent no - 19)}\
{wrap string(FLD REJ MSG,width of wrap,0,"");}\
{remain str}
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif // FLD OK COND
if FLD ED COND then}
Edit Value If:{space(indent no - 14)}\
{ last at = 0
remain str = ""
foreach FLD ED COND cntr1 in k}
{wrap string(FLD ED COND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{endif
if FLD DEF VAL then}
Default Value:{space(indent no - 14)}\
{ last at = 0
remain str = ""
foreach FLD DEF VAL cntr1 in k}
{wrap string(FLD DEF VAL,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD HLP MSG then
testing = 1
remain str = ""}
Help Message:{space(indent no - 13)}\
{wrap string(FLD HLP MSG,width of wrap,0,"");}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ testing = 0
endif
if !FLD EDITABLE or FLD CARRY}
Attributes:{space(indent no - 11)}\
{ if !FLD EDITABLE then}
Read Only\
{ if FLD CARRY then}
, \
{ endif}
{ endif
if FLD CARRY}
Carried Forward\
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif}
Position:{space(indent no - 9)}Row: {ROW POSITN - line cnt} \
Column: {COL POSITN} Screen No.: {page cnt}
{ print( replicate( "=",78) + crlf);
print( replicate( "=",78) + crlf);}
{ endif
++cnt;
next k
return;
// eof - fmt file body()
enddef
//--------------------------------------------------------------
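// Translate each letter of the field's picture function string into a
// plain-English description, one description per line; the scroll and
// multiple-choice functions also list their width or choices.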
define get functions(c)
if c.FLD PICFUN then
first = 1
if at("Z", c.FLD PICFUN) then}
Leave blank when value is Zero
{ first = 0
endif
if at("L", c.FLD PICFUN) then
if first then}
Pad number with leading zeroes
{ first = 0
else}
{space(indent no)}Pad number with leading zeroes
{ endif
endif
if at("$", c.FLD PICFUN) then
if first then}
Display number in financial format
{ first = 0
else}
{space(indent no)}Display number in financial format
{ endif
endif
if at("^", c.FLD PICFUN) then
if first then}
Display number in exponential format
{ first = 0
else}
{space(indent no)}Display number in exponential format
{ endif
endif
if at("C", c.FLD PICFUN) then
if first then}
Follow positive credits with "CR"
{ first = 0
else}
{space(indent no)}Follow positive credits with "CR"
{ endif
endif
if at("X", c.FLD PICFUN) then
if first then}
Follow negative debits with "DB"
{ first = 0
else}
{space(indent no)}Follow negative debits with "DB"
{ endif
endif
if at("(", c.FLD PICFUN) then
if first then}
Put parentheses around negative numbers
{ first = 0
else}
{space(indent no)}Put parentheses around negative numbers
{ endif
endif
if at("T", c.FLD PICFUN) then
if first then}
Trim trailing blanks from expression
{ first = 0
else}
{space(indent no)}Trim trailing blanks from expression
{ endif
endif
if at("B", c.FLD PICFUN) then
if first then}
Left align expression
{ first = 0
else}
{space(indent no)}Left align expression
{ endif
endif
if at("I", c.FLD PICFUN) then
if first then}
Center align expression
{ first = 0
else}
{space(indent no)}Center align expression
{ endif
endif
if at("A", c.FLD PICFUN) then
if first then}
Allow input of alphabetic characters only
{ first = 0
else}
{space(indent no)}Allow input of alphabetic characters only
{ endif
endif
if at("!", c.FLD PICFUN) then
if first then}
Convert entry/expression to uppercase
{ first = 0
else}
{space(indent no)}Convert entry/expression to uppercase
{ endif
endif
if at("R", c.FLD PICFUN) then
if first then}
Literals in template are not part of data
{ first = 0
else}
{space(indent no)}Literals in template are not part of data
{ endif
endif
if at("S", c.FLD PICFUN) then
if first then}
Scroll within a display width of: {c.FLD PIC SCROLL}
{ first = 0
else}
{space(indent no)}Scroll within a display width of: {c.FLD PIC SCROLL}
{ endif
endif
if at("M", c.FLD PICFUN) then
if first then}
Selection to be made from the following choices:
{ first = 0
else}
{space(indent no)}Selection to be made from the following choices:
{ endif
last at = 0
remain str = ""
first test = 1
foreach FLD PIC CHOICE cntr1 in c
if first test then}
{space(indent no)}{wrap string(FLD PIC CHOICE,width of wrap,last at,remain str);}\
{ first test = 0
else}
{wrap string(FLD PIC CHOICE,width of wrap,last at,remain str);}\
{ endif
next}
{remain str}
{ endif
endif
enddef
//--------------------------------------------------------------
define make_fmt_doc()
// Attempt to create document file.
default_drv = strset(_defdrive) // grab default drive from dBASE
fmt_name = FRAME_PATH + NAME // Put path on to object name
if not fileok(fmt_name) then
if !default_drv then
fmt_name = NAME
else
fmt_name = default_drv + ":" + NAME
endif
endif
fmt_name = upper(fmt_name)
if not create(fmt_name+".DOC") then
pause(fileroot(fmt_name) + ".DOC" + read_only + any_key)
return 0;
endif
return 1;
enddef
//--------------------------------------------------------------
define doc_header()
// Print Header in program
print( replicate( "=",78) + crlf);
print( replicate( "*",78) + crlf);}
*
*-- Data Description For: {filename(fmt_name)}FMT
*-- Date................: {ltrim( substr( date(),1,8))}
*-- Version.............: dBASE IV, Format {FRAME_VER}.1
*
{ print( replicate( "*",78) + crlf);
print( replicate( "=",78) + crlf);}
{
enddef
//--------------------------------------------------------------
define doc_box(cur) // Pass in foreach cursor
// Build box coordinates for a dBASE window command
var result, temp_page, line_cnt;
temp_page = page_cnt;
// Adjust box coordinates so that negative numbers are not generated
do while ( nul2zero(cur.BOX_TOP) - (scrn_size * temp_page) ) <= 1
temp_page = temp_page - 1
enddo
if page_cnt == 1 then
temp_page = 0
endif
if page_cnt == 2 then
temp_page = 1
endif
if !temp_page then
line_cnt = 0
else
line_cnt = (scrn_size * temp_page) + (1 * temp_page)
endif
result = nul2zero(cur.BOX_TOP) - line_cnt + ", "
result = result + nul2zero(cur.BOX_LEFT) + " To: "
temp = nul2zero(cur.BOX_TOP) + cur.BOX_HEIGHT - line_cnt - 1
if temp > scrn_size then temp = scrn_size endif
result = result + temp + ", " + (nul2zero(cur.BOX_LEFT) + cur.BOX_WIDTH - 1)
"From: " + result
enddef
//--------------------------------------------------------------
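// Break a long expression into lines no wider than the documentation
// width, preferring to break at a space or at a right parenthesis; any
// text that does not fit is handed back to the caller so that it can be
// printed on a following line.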
define wrap string(string2wrap, line width, last at, left over)
if len(left over) + len(alltrim(string2wrap)) > 237 then
parse str = left over + substr(alltrim(string2wrap), 1,
len(alltrim(string2wrap)) - len(left over))
left over = substr(alltrim(string2wrap),
len(alltrim(string2wrap)) - len(left over) + 1)
else
parse str = left over + alltrim(string2wrap)
left over = ""
endif
first time = 1
do while len(parse str) + len(left over) > line width - last at
breakpt1 = 0
temp str = parse str
do while breakpt1 + at(" ", temp str) < line width - last at && at(" ", temp str)
breakpt1 = breakpt1 + at(" ", temp str)
temp str = substr(temp str, at(" ", temp str) + 1)
enddo
breakpt2 = 0
temp str = parse str
do while breakpt2 + at(")", temp str) < line width - last at && at(")", temp str)
breakpt2 = breakpt2 + at(")", temp str)
temp str = substr(temp str, at(")", temp str) + 1)
enddo
if !breakpt1 || breakpt1 > breakpt2 then
if !breakpt1 then
if first time then}
{alltrim(substr(parse str, 1, line width - last at))}
{ first time = 0
parse str = alltrim(substr(parse str,
(line width - last at) + 1, len(parse str) - (line width - last at)))
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, line width))}
{ parse str = alltrim(substr(parse str, line width + 1,
len(parse str) - line width))
endif
else
if first time then}
{alltrim(substr(parse str, 1, breakpt1))}
{ first time = 0
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, breakpt1))}
{ endif
parse str = alltrim(substr(parse str,
breakpt1 + 1, len(parse str) - breakpt1))
endif
else
if first time then}
{alltrim(substr(parse str, 1, breakpt2))}
{ first time = 0
parse str = alltrim(substr(parse str,
breakpt2 + 1, len(parse str) - breakpt2))
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, breakpt2))}
{ endif
parse str = alltrim(substr(parse str, breakpt2 + 1,
len(parse str) - breakpt2))
endif
if left over then
if len(left over) + len(parse str) <= 237 then
parse str = parse str + left over
left over = ""
endif
endif
enddo
if first time then}
{alltrim(parse str)}\
{ else
breakpt1 = 0
temp str = alltrim(parse str)
do while at(" ", temp str)
breakpt1 = breakpt1 + at(" ", temp str)
temp str = substr(temp str, at(" ", temp str) + 1)
enddo
breakpt2 = 0
temp str = alltrim(parse str)
do while at(")", temp str)
breakpt2 = breakpt2 + at(")", temp str)
temp str = substr(temp str, at(")", temp str) + 1)
enddo
if breakpt1 >= breakpt2 then
if !breakpt1 then
last at = 0
remain str = parse str
test1 = ""}
{space(indent no)}\
{ else}
{space(indent no) + substr(parse str, 1, breakpt1)}\
{ if len(parse str) > breakpt1 then
remain str = substr(parse str, breakpt1 + 1)
endif
last at = breakpt1
endif
else}
{space(indent no) + substr(parse str, 1, breakpt2)}\
{ if len(parse str) > breakpt2 then
remain str = substr(parse str, breakpt2 + 1)
endif
last at = breakpt2
endif
endif
enddef
// EOF: Form.cod
}
To see what this auto-documentor can do, let's assume you've created a
screen for the file "Orders" which has the following fields:
Part_Name, Part_ID, Date_Order, Discontinu, Cost, Qty_2Order, Qty_OnHand,
Comments, Terms, Discount.
After compiling the documenting .COD file and making it the default
template for form generation, resaving the form will create a file with the
same name as your screen form file and an extension of .DOC. A sampling of
this printout is shown below.
==============================================================================
******************************************************************************
*
* Data Description For: ORDERS.FMT
* Date................: 3-31-91
* Version.............: dBASE IV, Format 1.1
*
******************************************************************************
==============================================================================
Field Name: Part_name
Field Type: Character
Picture Clause: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Help Message: Enter the Part's Name
Position: Row: 2 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Part_id
Field Type: Character
Picture Clause: 99-9999-9999
Picture Functions: Literals in template are not part of data
Help Message: Enter the Part Number
Position: Row: 2 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Date_order
Field Type: Date
Picture Clause: None
Help Message: Enter the Date of Order
Attributes: Carried Forward
Position: Row: 3 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Discontinu
Field Type: Logical
Picture Clause: Y
Position: Row: 4 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Cost
Field Type: Numeric
Picture Clause: 999999.99
Picture Functions: Display number in financial format
Help Message: Enter the Cost of this part
Position: Row: 6 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Qty_2order
Field Type: Numeric
Picture Clause: 9999
Edit Value If: .NOT. DISCONTINU
Help Message: Enter the quantity that is to be ordered for
this part
Position: Row: 6 Column: 39 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Qty_onhand
Field Type: Numeric
Picture Clause: 9999
Help Message: Enter the Quantity that we have on hand for
this part
Position: Row: 6 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Discount
Field Type: Numeric
Picture Clause: 99
Accept Value When: DISCOUNT = IIF( Lead_Time > 120, 0, IIF(
Lead_Time <= 120 .AND. Lead_Time > 90, 5,
IIF(Lead_Time <= 90 .AND. Lead_Time > 60,
10, IIF( Lead_Time <= 60 .AND. Lead_Time >
30, 15, 20 ))))
Unaccepted Message: Lead Time
>120:Dis=0;<=120,>90:Dis=5;<=90,>60:Dis=10;<=6
0,>30:Dis=15;<=30:Dis=20
Help Message: Enter the Percentage Discount we will
realize depending on lead time.
Position: Row: 11 Column: 66 Screen No.: 2
==============================================================================
==============================================================================
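The documentation itself is an ordinary text file, so once the form has
been saved you can browse it with MODIFY COMMAND or list it from the dot
prompt (Orders.doc is the file produced for the hypothetical Orders screen
above):
TYPE Orders.doc
From there it can be printed or folded into the rest of your system
documentation.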
Tax(). The next step is to decide if it needs any arguments. Arguments
are optional but since the amount of tax is dependent on the amount of the
sale, we must use the amount of sale as an argument to the Tax() function.
In other words, if we want our function to tell us how much tax is due, we
must tell it the amount of the sale on which to compute the tax. On the
receiving side, arguments are called parameters. For a function to be able
to receive the information you are passing to it in the form of arguments,
the function must be defined with corresponding parameters. The number of
arguments used when calling the function must match the number of
parameters defined in the function. Otherwise, dBASE IV will tell you that
you called the function with the wrong number of parameters. To define a
parameter, use the PARAMETER command immediately after the FUNCTION
declaration:
FUNCTION Tax
PARAMETER mSale amt
This defines a single parameter called mSale amt. This parameter is a
memory variable that will be initially set to the value we pass as an
argument. Because we want to perform a numeric calculation on the amount
of sale, the argument we use should be a numeric value, as opposed to a
character, date, or logical value. Although you can use other data types
as arguments or return values, this particular function requires a numeric
value. This memory variable/parameter is "local" to the function: no other
program or function can "see" it or modify it. Also, all parameters and
all other memory variables defined in a function are automatically released
when the function is done with its task. Of course, memory variables that
have previously been declared public are an exception.
Whether you use any parameters or not is up to you, but if you define
parameters, you must call the function with the corresponding number of
arguments. A parameter cannot be optional in a UDF like it sometimes can
in a regular dBASE function. If your UDF calls for it, you must pass
something. You can pass a null string ("") or 0 to indicate to your UDF a
certain condition.
Finally, since this is such a simple calculation, the last step is to send
back the amount of sales tax.
FUNCTION Tax
PARAMETER mSale amt
RETURN mSale amt * .07
The RETURN command provides a special function in a UDF in that it can
relay a response from the calculation performed therein. Any expression
in the RETURN command is evaluated first and the result is what is sent
back as the answer. In a regular program or procedure, the RETURN command
does not allow any other arguments that return a value. In this case,
however, the RETURN command figures out the amount of tax due using the
formula we gave and sends the result back as it's response to the question
"How much tax on this amount?"
Suppose we want to make this UDF more generic, more flexible. Let's make
it so that we need to provide both the amount of sale and the amount of tax
two arguments. The reason this is more flexible is that we don't need to
modify the UDF if we have to deal with a different tax rate you just
change the value of the argument that provides the tax rate. Also, let's
suppose we don't want to figure out the decimal version of the tax
percentage we want to just say 7 as the percentage. Then we need to also
tell the UDF to do the conversion for us:
FUNCTION Tax
PARAMETERS mSale amt, mTaxperct
RETURN mSale amt * (mTaxperct / 100)
Now when we call this function, the value of the first argument is stored
in mSale amt, and the value of the second argument is stored in mTaxperct.
The RETURN command divides the value of mTaxperct by 100 and multiplies
that result by mSale amt. The final result is what is sent back as the
reply.
The result of this function would be a return value of 0.7.
? Tax(10, 7)
In short, this is all there is to a UDF. You define it, define its
parameters (if any), do some processing and return a value. In this simple
UDF, the RETURN command does the processing and returns the resulting
value. It evaluates the mathematical expression and sends its result back
as the answer. Let's look at a UDF that would require more than just three
lines of code.
The DBFSize UDF will tell us the exact size of the currently active
database file. In order to do this we need to determine the size of the
database header and then add to that the result of multiplying the size of
each record by the number of records.
Database size = Size of database header + (# of records * size of each record)
There are dBASE IV functions that tell us the number of records and the
size of the records, but none that will tell us the size of the header.
The size of the header is figured by multiplying the number of fields by 32
and adding 34 to the result.
Database header size = (# of fields * 32) + 34
There isn't a dBASE function that will tell us how many fields there are in
a database, so it's going to take more than a simple expression to do all
this. Without getting in to a lesson in programming, the UDF on the next
page accomplishes this.
FUNCTION DBFSize
mFcount = 0
* This loop counts the fields in the database.
DO WHILE "" # FIELD(mFcount + 1)
mFcount = mFcount + 1
ENDDO
mHeadsiz = 32 * mFcount + 34
RETURN mHeadsiz + (RECSIZE() * RECCOUNT())
We don't use any parameters with this function because we want it to work
with whatever database is currently in use. We could, if we wanted, define
a parameter to receive a database name and to give us the size of that
particular database. Try to incorporate that change yourself for practice.
Notice that you can use dBASE IV functions and programming structures into
your own functions. In fact, you can even incorporate other UDFs into your
UDFs.
For those of you familiar with dBASE IV procedures, there's basically only
one difference between a UDF and a procedurea UDF must return a value,
even if that value is 0 or a null string. One major advantage of a UDF
over a procedure is that you can call a UDF any place that you could call a
dBASE functionthere are no DO commands involved. It's just like extending
the dBASE language.
Accessibility of UDFs
Okay, so now you've created a UDF, but how does dBASE IV know that you've
created one? There are several ways. One option is that you can create a
library of functions and SET PROCEDURE to that library to have access to
all of the functions in that library. A library of functions would be a
single program file with several functions defined in it. By issuing the
command
SET PROCEDURE TO
you would have access to all of the functions defined in that file. This
file can be created using MODIFY COMMAND at the dot prompt or we could
create a dBASE Program (not an application) from the Application panel in
the Control Center. If you already have a number of UDFs scattered about
in your directory and would like to consolidate them under one file, open a
new file using MODIFY COMMAND and use the shortcut, Ctrl-K R to read in the
different files. Within this procedure, all your UDFs claim their own
territory by their opening and closing statements which will always start
with the command FUNCTION and end with the command RETURN with a value.
Otherwise, you can type in your functions, one beneath the other as you are
inspired to create them.
For example, you could create a dBASE program called MyFunc that has all of
your UDFs in it:
*Myfunc.PRG
FUNCTION Tax
RETURN
FUNCTION DBFSize
RETURN
FUNCTION SomeOthr
RETURN
* and so on with all of your functions
* End of Myfunc.PRG
After saving it, you would go to the dot prompt or to some other program
where you need these functions and type:
SET PROCEDURE TO Myfunc
If everything compiles fine, you will now have access to any of the UDFs in
Myfunc. The problem with this method is that you must remember to use the
SET PROCEDURE command to be able to use your functions.
Another option is to put the UDF in a program file of the same name. For
our Tax() example, we would create a program (from the dot prompt or the
Control Center) called Tax. It would contain the definition for a single
function, the Tax() function. Whenever you make a call to the Tax()
function, dBASE IV will first look to see if it has such a command itself.
Verifying that it doesn't, dBASE IV will next check to see if the current
PROCEDURE file has this function defined in it somewhere. If that's not
the case, dBASE IV scans the current directory for the file TAX.DBO (the
compiled version of the .PRG. If it hasn't been compiled, dBASE IV will
compile TAX.PRG automatically). Upon finding it, dBASE IV loads it in to
memory and runs it. Although this method frees us from having to remember
to SET PROCEDURE, it is technically slower because it will always involve
hard disk access time the first time you use the function and may require
additional disk access for successive calls to the function. This
difference in speed may be negligible. However, when developing a UDF,
this method may be easier since you will only have one UDF in the file and
won't have to go hunting for the UDF you are trying to perfect. You decide
which method you like better.
For programmers there is an additional option: put the function definition
in your main routine and it will be accessible to all sub-routines and
procedures and to other UDFs.
Once you implement any of these options, your UDF is available to dBASE IV
and could potentially be used anywhere that a dBASE IV function is used.
Limitations with UDFs
As mentioned earlier, you can use any of the existing dBASE IV commands,
functions, or programming structures in your UDFs. From that perspective,
there is no limit to what can be done in a UDF. What you can do in a UDF
is probably going to be most limited by the context that you try to use it
in.
As you design your UDF, you must remember where it is that you will be
calling this UDF from and how it is that you will be calling it. If you
look at the Tax() example, you'll notice that it basically just performs a
mathematical calculation. There is no need to move a record pointer in a
database, open another database, change work areas, modify the information
on the screen, open or close a window or any other complex operation. It's
a very safe UDF. You could use it in a format screen, a report form, as a
calculated field in a query, or from the dot prompt as part of a LIST
command. If you SET DBTRAP OFF, you could even use Tax() as part of an
index expression.
Warning: If you want to incorporate your UDF in to these other parts of
dBASE IV, bear in mind that you are now sharing control in how dBASE IV
operates. And when you set DBTRAP OFF, you're obliged to be responsible in
your coding or otherwise run the risk of encountering potentially fatal
errors that can hang the program.
Next month, we'll expand the functionality of UDFs by using them in Reports
and Screen Forms.
3 Manual Overwrite
Manual Overwrite
When ? Is not Like @...SAY
Joe Stolz
If you've ever perused the Language Reference, you may have noticed the two
commands located at the very beginning: @.SAY and ?/??. Both of these
commands can display information to the screen, to a printer or to a file.
@.SAY seems most adept at printing information at a specific location on
the screen or on paper since the exact coordinates of the desired text or
field will be used for precise placement on the page. The ? command seems
a poor substitute for @.SAY. However, you may have also noticed that dBASE
IV has enhanced the abilities of ? beyond those of dBASE III PLUS and added
some unusual clauses; for example, AT and STYLE. Just what is going on
with ? and when should one use one over the other?
Let's Review Some Facts
In the days of dBASE II when you practically lived at the dot prompt, the ?
command was one of the best ways to communicate with the dBASE II program.
For example, to perform an equation you would type
? 1 + 1
and dBASE II would return 2. To verify even the most simplistic arithmetic
operation, you might type
? 5 = 5
to which the reassuring response is .T.. The ? command was loosely
translated as "What is" in the documentation.
Perhaps in those days, as today, you were also perplexed about the fact
that two commands SET PRINT ON and SET DEVICE TO PRINT (really, it was SET
FORMAT TO PRINT in dBASE II) existed. Well let's clear up these problems
once and for all and learn something in the process.
In a nutshell, SET DEVICE TO goes with @.SAY and SET PRINT goes with ?
commands. However, where SET DEVICE determines to which device @.SAY
commands will be directed, SET PRINT effects a wide range of commands,
directing them all to the printer if you SET PRINT ON. Commands that will
be printed are ones that normally display to the screen in a scrolling
fashion like LIST, DISPLAY MEMORY and any command that echoes to the screen
when SET ECHO is ON.
For formatting purposes, many programmers naturally leaned toward the use
of @.SAY commands because you could specify precise locations on the screen
or paper where you wanted output to appear. Originally, the ? command gave
no such luxury. In dBASE IV, much more octane was added to this
fundamental command. The addition of the AT clause for example, allowed
you to specify the column position where output would appear.
If you dare, view the contents of one of your report .FRG files or label
.LBG files. You will see (if you are programmatically inclined) that all
lines in the report that are sent to screen or printer do so through a
series of ? and ?? commands, often using the AT clause. If @.SAY is so
easy to use and predictable, why are reports using ? commands?
One reason is that a ? command better lends itself to the variable nature
of a dBASE report which could conceivably be the template for a number of
files. @.SAY commands can be directed to the screen in any way desired and
still display a pretty picture. However, since printers cannot back up to
the previous line (they print from top to bottom on a page, a line at a
time), an @.SAY directed to line 10 followed by one to line 4 will cause a
page eject and then a movement to line 4 on the following page. This can
be annoying to programmers but is entirely logical given the abilities of
printers.
Needless to say, @.SAY commands can still do a masterful job at printing a
well laid out page. Probably the overriding factor about reports that
lends them so well to ? type printing is that reports are usually very line
oriented. In a report, each detail record takes a line by itself. If
there are ten detail records, there will be ten lines. This is easily
handled by a series of ten ? commands, or by a DO loop that performs a ?
ten times. This isn't quite the same with @.SAY commands, though they too
can accommodate an unpredictable number of lines/records by referencing
line coordinates with a memory variable rather than a numeric constant.
Still the variable would have the potential to cause a page eject if a
print row was a value less than the previously stated print row. Thus, it
is obvious that using @..SAY would require more maintenance.
Streaming Output
dBASE IV introduced a new type of printing terminology: streaming output.
Streaming output is what is sometimes referred to as linear - line by
line. Streaming output includes ? but not, under any circumstances, does
it include @.SAY output. This causes much chagrin but is an absolute fact
and must be accepted.
The ? commands are well suited to "streaming output" which is naturally
line after line, represented by ? after ? in your report program. It seems
a bit more adapted to unpredictable printing jobs.
Be that as it may, dBASE IV has clearly chosen ? to be the work horse of
printing. Let's look a very fundamental aspect of ? printing and we will
see that ? is undoubtedly the best choice in printing jobs.
So what does streaming output do for you? New advanced printing functions
are available to print jobs that are streaming output oriented. These
include BEGIN PRINTJOB, ON PAGE AT LINE, and system memory variables like
pcopies, pscode, and pecode. A PRINTJOB is a series of commands and
procedures that together make up a print routine. If the variable pcopies
is set to 2 or more, the PRINTJOB will create a spool copy of the routine
and run a quick second copy (or more as needed). PRINTJOBs offer a means
to consolidate all operations that are relevant to printing, including some
that are hardware and environment dependent. Space doesn't permit an
in-depth discussion of print jobs but to re-emphasize the point, PRINTJOB
is for streaming output commands only and @.SAY is not a streaming output
command.
The other important aspect of ? that is unique to it is the STYLE clause
that can be applied. Again, this is a uniquely print-oriented feature.
STYLE offers certain standard type styles that can be applied to the text
that is being output by the ? command. These are Bold, Underline, Italics,
Raised and Lowered (if your printer supports these fonts). Further, each
printer can support five fonts that are unique to the type of printer you
use. STYLE also offers the only means to access these fonts via a command
in dBASE IV. Again, @.SAY doesn't offer this ability when printing either.
As if to totally tip the scale in favor of ? and streaming output,
PostScript must be printed through ? commands and not through @.SAY
commands. Due to the uniqueness of PostScript printing, ? commands
facilitate the process. If @.SAY commands are used to print with the
PostScript driver, gyrations must be performed to get line feeds and form
feeds. This a needlessly difficult way to print when ? makes a PostScript
interface as painless as printing to any other printer type.
That means that ?/?? appears to be the winner when it comes to printing.
If any doubt remains that this is the premier way to print, check out the
Language Reference on the commands discussed and see what they can do for
you!
4 The Theory of Relativity
The Theory of Relativity
(as applied to the PROTECT command)
Loretta A. Steele
Once there was a little mom-and-pop operation specializing in cheap
furniture and was appropriately named The Cheap Furniture Store.
Throughout the first twenty or so years, they fared marginally well. In
the eighties, it seemed that they would have to go belly up. No one was
buying anything cheap. And then the nineties hit and suddenly, cheap
furniture was the in thing. Everybody's belt was so tight, they had to add
extra notches and carry oxygen tanks. It was then that The Cheap Furniture
Store's business flourished. They expanded their staff, computerized their
operation and changed their name to something that sounds Scandinavian.
As fate would have it, their computer operations were managed by dBASE IV.
As fate would also have it, they hired an employee who was a little too
smart for his own good. Before they knew it, they had their own
embezzlement scandal. They lost some money and presumably, some faith in
their fellow man. But times had changed drastically since that memorable
day when they hung their first GRAND OPENING sign. Computers had made
thievery an exquisite art and no one was safe. It became apparent that a
security system was necessary to protect the assets and confidential
information that used to be in a locked file cabinet. The computer age had
reduced all of this to a very vulnerable state. So they called the same
brilliant consultant that set up their system initially, to ask him what
software or hardware they must now purchase to safeguard their computerized
investment. He told them that the capability was already there and that
implementing it would be no problem. (Of course, he was a consultant.)
It is from here, we will explore the considerations necessary for ensuring
the security of their accounting database files. Because of the job
requirements of the different store employees, the database files that make
up the store accounting system must have some level of restricted access.
The questions then concern how the database files are to be protected from
unauthorized access by store employees and how the fields within the
databases files are to be secured.
The job requirements of the furniture store employees in relation to the
accounting database files will be the focus of establishing the protected
accounting system. The makeup of the staff consists of the store manager,
payroll clerks, account receivable clerks, order entry clerks and an
outside accountant. The database files to be protected are: EMPLOYEE.DBF,
ORDERS.DBF, ORDERS.DBT and ACCT REC.DBF. These files are found on the
Samples Disk that is included with the dBASE IV package. As you go through
this tutorial, you can follow along on your system using these files.
The PROTECT utility includes a user log-in security, database file and
field access level security, and data encryption of database, memo, and
index files. You can use PROTECT just for the log-in security without
involving access levels and data encryption. If you choose to use the
access levels and data encryption, PROTECT will always control the access
to the encrypted files.
Privileged Information
The log-in security is a password-protected feature. A user must enter
their correct group name, user name and password before they can enter
dBASE IV. The three items are collectively called the User Profile which
is maintained in the Dbsystem.DB file created and maintained in the dBASE
directory. If a user fails to correctly enter their user profile after
three tries, the PROTECT system will return them to the operating system
prompt. The log-in security is protected regardless of what directory or
drive dBASE IV is invoked from. The Dbsystem.DB file is an encrypted file
so no one can discover its stored user profiles. If erased or damaged, it
can effectively lock you out of ever accessing your data unless you have
unencrypted backups or a reliable backup of the Dbsystem.DB file itself.
The Access Level security controls the usage of the database files and the
fields within those files. Each user is assigned an access level, which
gives them specific privileges or establishes what the user can do with the
file and its fields. These file and field privileges are called the users'
privilege scheme. The first part of the privilege scheme is File access
privileges. These privileges consist of Read, Update, Delete and Extend.
The Read privilege gives the user the right to view records within a
database file. The Update privilege allows the user to edit or change the
contents of records. The Extend privilege gives the user the right to
append or add records to the database file. The Delete privilege grants
the user the right to delete records from a database.
The second part of the privilege scheme is Field Access Privileges. These
privileges are: Full, Read Only (or R/O) and None.
Full privilege authorization means a user can read and write to a field.
If a user has the Read Only or R/O privilege, they can read a field but
can't edit or change the contents of a field. The None privilege means the
user will not be able to view that field and, since it isn't visible, they
can not edit the field.
The final feature of the Protect system is data encryption. The database,
memo and index files can be encoded to prevent the reading of their
contents. Only by logging into a PROTECTed dBASE IV system can users
access and read these files and only through the features of a PROTECTed
system can these files be decrypted and made readable outside of a
non-protected dBASE IV system. The encryption process is automatic once
access level security information is saved.
The Store Manager must have access to all files and fields within all
databases. The Manager will be assigned an user access level of 1 on each
database file. An example of the Manager's user profile for the
EMPLOYEE.DBF file follows:
Group name: STAFF
User name: MANAGER
File name: EMPLOYEE
Password: ALLRIGHT
Access level: 1
Clerical Responsibilities
The Payroll Clerks only deal with the EMPLOYEE.DBF database file. The
majority of the fields in this file are edited by the Payroll Clerks. But
there are a few fields that the Clerks can not change. With this in mind,
a user access level of 4 is assigned to Payroll Clerks. Because of their
restricted access, they obviously cannot be given the same access level as
the Manager. On the other hand, an access level of 8 would be too
restrictive. The number 4 is midway between the least restrictive access
level and the most restrictive level. There is nothing magical about the
number 4. Assigning access levels is based on a sliding scale. An access
level number is only significant in relation to the other access levels
you've set. What we really need to focus on is the Payroll Clerks access
in relation to the Store Manager. It's obvious that the Clerks must not
have the same access as the Manager. In fact, it was real obvious when
that whole nasty embezzlement scandal happened.
We also know that the Clerks cannot be completely denied access to the
EMPLOYEE file. This leaves us with a range of access level numbers from 2
to 7. Any number in this range would work if we are only considering a
Store Manager and Payroll Clerks. But our system will also involve an
Accountant who needs access to the EMPLOYEE file. Since we have a range to
play with, one approach during the initial assignment of access levels is
to choose a mid-range number where possible. This method gives you some
flexibility during your assignments. Don't box yourself in by trying to
assign levels in strict sequential order. Spread the numbers out, give
yourself some breathing room. Always focus on each users needs in relation
to the other users.
The Order Entry Clerks must use the ORDERS.DBF database. They are assigned
a user profile access level of 4. These Clerks will have limited access to
the ORDERS file. However, they must not be completely restricted. An
assignment of 4 gives us leeway, just as in the case of the Payroll
Clerks. We still have an Accountant who needs access to the ORDERS file.
So in relation to the Store Manager's access level of 1 and a very
restrictive level for the Accountant, the Order Entry Clerks can fit midway
on our sliding scale of access levels.
Now for the Accounts Receivable Clerks (A/R Clerks). They must have access
to the ACCT REC.DBF and ORDERS.DBF files. In dealing with the ACCT REC.DBF
file, the A/R Clerks will enter and maintain all fields. Their access
level will be a 5. You might be wondering why their level is only 5.
After all, they must have access to all of the fields. Wouldn't it be more
logical to give the least restrictive level of 1? In an isolated
circumstance, that reasoning would be correct. The A/R Clerks deal with
two database files. We need to make an assignment that will encompass both
files.
In the ORDERS file, the A/R Clerks only need to read the information but
never alter it. This is a very restrictive access. But it is not the most
extreme access. The most extreme access would be no access at all. We
wanted to make one assignment for the A/R Clerks that would reflect their
needs in both the ACCT REC and ORDERS files. If we had given them an
access level of 1 in ACCT REC because they needed to maintain all the
fields in that file that would not have reflected the restrictions placed
on the ORDERS file of read-only. We could go back to our comfortable
midway assignment of 4. That would reflect the needs of the Clerks in both
files but we should give the A/R Clerks a more restrictive assignment. The
Payroll and Order Entry Clerks all have an assignment of 4 but they are not
as restricted in their respective files as are the A/R Clerks with the
ORDERS file. Therefore, the access level of 5 was based on that
reasoning. A/R Clerks need access to all fields in ACC REC but they need
more restrictive use of the ORDERS file. Their access level in both files
will be a 5.
The Outside Accountant
The final user is the outside Accountant. The Accountant must have access
to all the files (EMPLOYEE.DBF, ORDERS.DBF, and ACCT REC.DBF). However,
the access of the fields within those files will have some strong
limitations. The Accountant's access must be more limited than any of the
staff. At this point the most restrictive assignment we may have is an
access level of 5. Therefore, in relation to the Accountant, we have a
range of 6 to 8. The Accountant will have an access level of 7. The
numbers 6, 7, or 8 would have served our purpose but our philosophy is to
spread the numbers out when possible. If 6 will work, so will 7. Access
level assignment is a sliding scale not a strict fixed scale. Always
consider the numbers in relation to the other assignments. Spread the
numbers out, be flexible.
Going by the Book
We're approaching the final stages of creating a PROTECTed database system,
having completed security request forms (a form for this purpose is found
in the dBASE IV manual, page 14-41 of Using the Menu System in the version
1.1 package. In version 1.0, it is found in Networking with dBASE IV, but
really, why haven't you upgraded?) for all users and groups. We've
specified the appropriate user profiles and privilege schemes. What's
left? Only the task of entering this information into the dBASE IV PROTECT
Menu remains. This will create a Dbsystem.DB encrypted file to contain
user profile data and then encrypt our database and memo files.
We will start by backing up any original database and memo files for
safekeeping. Maintaining regular backups are very important but especially
so when working with encrypted files. Should some sort of corruption
render your file in a perpetually locked state, oblivious to passwords, a
reliable backup will be your only option. You must SET ENCRYPTION OFF
before COPYing a file to a backup. Copying an encrypted database through
DOS will not be a reliable backup should your password system become
corrupted.
The next step will be to create a new subdirectory called PROTECT, beneath
the DBASE subdirectory, copying all files to be protected into that
directory. Step three is to get a printout of all database file structures
that will be used in PROTECT for documentation purposes. Each file will
exist in three separate places: One in the original DBASE subdirectory,
the second on floppy disks, cartridge or tape, and the third set in a
separate PROTECT subdirectory. The files in the PROTECT directory are the
ones we will use in establishing a protected dBASE IV accounting system.
Using the printouts of the database structures and security request forms,
we're ready to enter the PROTECT menus and record user profiles and
privilege schemes. When you start PROTECT, you must enter an Administrator
password. The password should be one that's easy to remember since it is
your only access to a PROTECTed system, once it has been established. If
you forget the password, there is no way of retrieving it later, short of
eliminating the DbSystem.DB file and all encrypted files and starting
over. So store this password in a safe place.
Entering PROTECT Mode
The Protect Menu has four options: Users, Files, Reports and Exit. The
Users option is where we create user profiles which are maintained in the
Dbsystem.DB file. To create a user profile, we must give each user a login
name, password, group name and access level. A full name is optional, it
is used for my benefit to describe in more detail the identity of the
user. The most important assignment in a user profile is the access
level. This level is used in the File menu to establish the type of access
the user will have to a database file and its fields. After completing a
users' profile, we move the cursor down to the 'Store user profile' and
press the Enter key before proceeding with the next user.
User Profiles
We now move to the Files options after entering all my user profiles. Here
we will set up group access by file and field privileges. We will closely
follow the data recorded on the Security Request Forms. The privilege
schemes I create will be saved in the database file structure. Each
database is associated with only one group but one group may have many
associated database files. The two most important sections of this menu
are: File access privileges and Field access privileges. The File access
privileges are those values which are the most restrictive for all users in
the system. Looking over the completed Security Request Forms, enter the
access levels for Read, Update, Extend and Delete.
The 'Field access privileges' relate to the individual access levels from
the Users menu. This is where each field in the database is assigned:
Full, None or R/O (read only) status. After completing the first access
level privileges for a database file, instead of proceeding to the Store
file privileges option, we back up to Access level and proceed to enter the
next level for this database. Continue until all the access levels for a
particular database file have been entered. At that point, you can then
select the Store file privileges option.
Get It in Writing
The next menu option, Reports, is very important. Since the Dbsystem.DB
contains the security status of each user and is an encrypted file, you
will want to have a hard copy of this security information. The User
Information Report lists the user name, password, group, fullname and
access level for each user. The File Information Report gives you the
group name, file access privileges and field privileges. Print this
immediately after the database file is encrypted. Encryption is done
through the Save feature in the Exit menu option. It is important to print
this information entirely; primarily, to verify the accuracy of your
protected system and, secondly, as a hard copy record of your protect
system for documentation and maintenance purposes. Lastly, it may sound
like a broken record but don't forget to save your password in a secure
place.
The Exit option has three features: Save (new and updated user profiles and
privileges schemes - .DBF converted to .CRP files and .DBT become .CRT),
Abandon (cancel all new and updated user profiles and privileges schemes)
and Exit (new and updated user profiles and privileges schemes are
encrypted on the database files if not already done through the Save
option.)
We now exit the PROTECT menu and return to the Control Center or dot
prompt. Before we can test our protected system, we need to rename the
newly encrypted .CRP files to .DBF files. Likewise for encrypted memo
files (.CRT extension) to .DBT memo files. Also, it is imperative to
re-index your databases as well.
Going for a Test Drive
Logging in as the Store Manager, we can test for the accuracy of the group
name, user name and password. In dBASE IV, open the encrypted EMPLOYEE.DBF
file and go into the BROWSE mode. In this mode, we can test for reading,
extending, updating and deleting of records within the database. If the
Store Manager's privilege scheme has successfully passed, we go to the dot
prompt and type LOGOUT. This will allow us to login the next user profile
for testing.
Continue testing the rest of the users for their user profiles and
privilege schemes against the Security Request Forms by repeating the above
procedure. Upon completion of the testing, you can erase the old .DBF and
.DBT files from the system and leave the encrypted files for use in dBASE
IV.
Protect Tips to Keep in Mind
Two types of files are encrypted during the PROTECT session:
database and memo files.
Index files are only encrypted when you REINDEX or create them with
an encrypted database.
Record your Administrator Password! There is no way of recovering
this information if forgotten.
Use the Security Request Forms or a form of your own design to
record user profiles and privilege schemes. A Protected system must be
maintained and regularly updated. These Forms will serve as your
blueprint to the system. Also keep copies of the reports printed from
the Reports Menu in PROTECT.
If you erase the Dbsystem.DB file after your database files have
been encrypted, you wont see the login screen upon entering dBASE IV.
However, when you try to open one of the encrypted database files, you will
get the error message "Database encrypted" and you won't have access to
that file. Secure your Dbsystem.DB files. Make a backup copy.
Backup your encrypted files frequently. If your files are
important enough to be encrypted, then back them up often. This is true of
your unencrypted files. Having current backups will save you time and can
eliminate potentially many hours of recreating your data.
Protect Commands and Functions
LOGOUT
Logs out the current user and presents a new login screen to allow another
user to login. LOGOUT closes all open database files, associated files and
program files.
SET ENCRYPTION ON/OFF
Default is OFF. In a protected system SET ENCRYPTION is ON. With SET
ENCRYPTION OFF an encrypted file can be copied to a decrypted form. This
is usually done to export a file to another application, to make a backup
copy of the file or to use it in a non-protected dBASE IV system.
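For example, to make an unencrypted copy of the employee file for export
(EMPTEMP is just an illustrative name for the target file):
SET ENCRYPTION OFF
USE EMPLOYEE
COPY TO EMPTEMP      && EMPTEMP.DBF is written in unencrypted form
USE
SET ENCRYPTION ON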
ACCESS()
Returns the access level of the current user in numeric form. A user with
an access level of zero cannot access any encrypted files. In a
non-protected system, ACCESS() always returns zero.
USER()
Returns a string containing the user name of the current logged-in user.
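As a small illustration of both functions, a program might greet the current
user and turn away anyone who is not logged in:
IF ACCESS() = 0
   ? "You are not logged into a protected system."
ELSE
   ? "Logged in as " + USER() + ", access level " + LTRIM(STR(ACCESS()))
ENDIF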
The End of Our Saga
Now all is well with the store. In fact, they're getting ready to become a
chain. Great big warehouse type stores with play areas for the kids and a
cafeteria that serves microwaved entrees like Swedish Meatballs. Customers
will come from all over to get great deals on cheap furniture. And as they
rake in the major bucks, they are secure in the knowledge that all their
hard work is under the safe keeping of the dBASE PROTECT utility. The
End.
5 Q&A
Q&A
"Summed Summaries" Exception
Order of Precedence with calculated fields throws a curve ball for some.
I'm so confused. I tried to use the method described in "Summed Summaries"
(TechNotes/dBASE IV, January 1991) but got inaccurate results. I created a
named summary field in the Group Summary band with the "SUM" operator. I
didn't declare it as a "hidden" field because I wanted to see the summary
result. I then created a hidden named calculated field in the same band,
using the name of the summary field as the expression for the calculated
field. Then I put an "Average" type summary field in the Report Summary
Band, summarizing on the hidden calculated field that I had just created in
the Group Summary Band. Doing a spot check of one group, my averages were
lower than expected. What happened?
You got caught in the "order of precedence" squeeze. Since the Group
Summary Band has a named hidden calculated field and a named visible summary
field, the hidden field's value is calculated first because both fields are
named. This information is summarized on page 10-20 of Using the Menu
System.
To solve this problem, reverse the status of the two fields in the
Group Summary Band. Make the summary field hidden and "unhide" the
calculated field. This will force the report generator to calculate the
summary field first, followed by the calculated field. The calculated
field will display correctly, and the average in the Report Summary Band
will now be accurate.
Dynamic Colors
How do I allocate colors dynamically in a program? I want to be able to
use memory variables to supply colors for various portions of my
programs.
You can use macro substitution in most cases:
@ 5,5 SAY "Hello world!" COLOR &ClrInfo
However, you cannot use this technique in a file that you SET FORMAT TO.
Macro substitution is not allowed in the region of a .FMT file that has
your @..SAYs and @..GETs.
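For instance, the color string itself can be chosen at run time and stored
in a memory variable before the macro is expanded (Overdue is a hypothetical
logical variable here):
ClrInfo = IIF(Overdue, "W+/R", "W+/B")    && bright white on red or on blue
@ 5,5 SAY "Account status" COLOR &ClrInfo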
Double Your Data, Double Your Fun
I have a database where the data in one set of fields will often be
the same as the data in another set of fields. For example, one set of
fields is for the mailing address of a customer and has the name of the
person who placed an order, and the other set of fields is for billing
information and has the name of the purchasing agent. A lot of times much
of this information is the same. Is there a simple way to have the
information in the first set of fields automatically copied to the second
set during data entry?
If you set up your format screen as a two-page format screen, there is
an extremely simple way. Place the first set of fields on the first
page and the second set of fields on the second page. For each field in the
second set, go to the Edit options: Default value and enter the name of
the corresponding field from the first set that you want copied. The catch
is that the two sets of fields must be on separate pages for this to work.
A Unique Problem
If I create an index with unique keys using the Control Center or the dot
prompt, I don't get unique records when I choose the Include Indexes option
in my queries and choose to sort on the field or index expression that is
supposed to have a unique key or conditional index. Finally, sometimes I
don't see some of the tags that I know are part of my database in the Query
generator.
The problem with using an index with unique keys is related to how the
index is created. In order for it to work properly, you must create the
index from the dot prompt with the following commands:
SET UNIQUE ON
INDEX ON
SET UNIQUE OFF
This is the only way to create an index tag that the Query generator will
use as a unique index. In addition, you should create an index expression
such that it is not just the name of your field. In other words, instead
of a field named Lastname being your entire index expression, you should
use Lastname + "" as the index expression. This applies to the situations
where you don't see indexes you know you've created or where certain
conditional indexes aren't filtering out records.
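As a sketch, with a hypothetical tag name of UniqName, the sequence would
be:
SET UNIQUE ON
INDEX ON Lastname + "" TAG UniqName
SET UNIQUE OFF
The Query generator will then offer the UniqName tag and honor its unique
key.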
Commands that Take a Back Seat
When I use the command
BROWSE FORMAT FIELDS Lastname, Firstname, Phone
I get more than just these three fields I specified. Why is dBASE IV
ignoring the FIELDS part of the BROWSE command?
The FORMAT clause of the BROWSE command is a great way to add data
validation and formatting to your applications that use BROWSE. When you
use this option, it takes precedence over the FIELDS option. If your format
screen has ten fields defined in it, for example, then whenever that
format screen is active and you issue the BROWSE FORMAT command, you will see
all ten of the fields defined in your format screen in your BROWSE screen,
regardless of whether you use the FIELDS clause. If you only want to see
three fields in BROWSE, with picture functions and validation, you need to
create a format screen containing only those three fields; you can then SET
FORMAT to that screen and use BROWSE FORMAT. This is as designed and
is mentioned in Language Reference.
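As a sketch, assuming you have saved a format file named PHONES that
contains only those three fields:
SET FORMAT TO Phones
BROWSE FORMAT
The BROWSE window will then show only Lastname, Firstname and Phone, with
whatever pictures and validation the format file defines.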
The Build Bone's Connected to the Print Bone...
When I run the BUILD utility, I can only access the Exit menu. Do I need
to re-install?
The problem is usually that your CONFIG.DB file has a line in it that is
more than 95 characters in length. This is commonly a printer definition
line. Trim it down by removing extra spaces or shortening the name of the
printer. Try BUILD again and it should work.
One Man's Bug is Another Man's Feature
When I have my TYPEAHEAD buffer set to 0, dBASE IV ignores my ON KEY LABEL
definitions. Is this a bug?
This "problem" has to do with how ON KEY LABEL is implemented. It has to
push something onto the keyboard stack and is unable to do this if you SET
TYPEAHEAD TO 0. Think of it instead as a way to temporarily disable your ON
KEY LABEL definitions without having to "un-define" them and later redefine
them. From one perspective, it could be considered anomalous. But there's
probably someone out there thinking, "I was wondering how I could
temporarily disable my ON KEY LABEL commands."
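A short sketch of the effect (HelpProc is a hypothetical procedure):
ON KEY LABEL F2 DO HelpProc
SET TYPEAHEAD TO 0      && F2 is now ignored; the trap cannot be queued
SET TYPEAHEAD TO 100    && F2 once again runs HelpProc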
6 Using Lotus Files
Working with Lotus Files
Kevin Ng
When considering a Lotus spreadsheet, as with most software that has been
around for a while, one must consider the evolution of its formats. The
nuances of each version usually show up by way of a modified extension from
the previous version. For instance, Lotus Release 1a creates files with
.WKS extensions, Release 2.x creates .WK1 files and Release 3 creates .WK3
files.
Thankfully, Release 2.x and Release 3 both include a translating utility,
TRANS.EXE, for translating a file to several different formats for downward
compatibility.
Direct File Transfer
There are two commands in dBASE IV or dBASE III PLUS to use with Lotus
files. They are the APPEND and IMPORT commands. What are the
differences? When should one use APPEND or IMPORT?
In dBASE IV, you can find IMPORT FROM in the Tools Menu of the Control
Center. APPEND FROM is in the APPEND Menu of a MODIFY STRUCTURE session.
Both APPEND FROM and IMPORT FROM are commands accessible from the dot
prompt or within a dBASE program.
IMPORT creates a database structure when invoked and can be used where no
dBASE database structure exists. This menu option/command is particularly
useful when you have in your possession only the .WK1 file itself and no
Lotus software by which you could access the contents.
APPEND FROM works with the original Lotus version .WKS files and is used
primarily when one has an existing dBASE database structure and the data
from the Lotus file is to be appended to it.
There are times when there is no alternative but to use the APPEND FROM
option, such as when the database you want to add data to already exists
and its structure must be conformed to. Under those circumstances, the
Lotus TRANS.EXE can be used to translate a Lotus type .WK1 back to a .WKS
file.
For the latest Lotus files, those with .WK3 extensions, it is advisable to
use the Lotus TRANS.EXE utility to translate the Lotus file directly into a
dBASE III file format. Note that dBASE IV can automatically read a dBASE
III file.
Indirect File Transfer
In the case where indirect file transfer is preferred, the APPEND FROM..SDF
command can be used for adding data to the database and the COPY TO..SDF
command can be used for exporting a database file to an ASCII text file.
SDF stands for System Data Format and is often referred to as Fixed Length
Format. SDF is recommended because Lotus uses only SDF for indirect
transfer.
A database structure with a matching number of fields of the appropriate
length must exist before the APPEND FROM..SDF can be used.
In dBASE IV, you can find APPEND FROM in the APPEND Menu of a Modify
Structure session. For COPY TO, use the EXPORT menu of the TOOLS menu in
the Control Center. Both APPEND FROM and COPY TO are commands accessible
from the dot prompt or within a dBASE program.
A good way to experiment with appending from or copying to an ASCII text
file is to use a test database to try the APPEND FROM and the COPY TO
commands. The dBASE text editor can be used to view the text file by using
the command
MODIFY FILE
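For example, with a hypothetical test database named ORDERS in use:
USE Orders
COPY TO OrdText SDF        && writes ORDTEXT.TXT in fixed-length format
MODIFY FILE OrdText.txt    && inspect the text file in the dBASE editor
APPEND FROM OrdText SDF    && read the same records back in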
Using IMPORT FROM..WK1
A common problem of IMPORTing FROM a Lotus .WK1 file is the "Two Decimal
Places Only Syndrome". Columns of numbers come in as fields with decimal
places set at 2 only. For example, 12.3456 comes in as 12.35 (note that it
is rounded up).
A Lotus spreadsheet allows "free" format. Formatting the spreadsheet
globally or by a specified range with fixed decimal places is a Lotus
option. To create a database structure through an IMPORT operation, dBASE
IV needs to know the information on decimal places up front. It uses 2 as
a default unless it can determine from the spreadsheet how it was FIXED in
Lotus.
With the decimal places FIXED at 4, 12.3456 will import precisely. FIXED
at 7, 12.3456 will show up in the spreadsheet as 12.3456000 and will
come into dBASE IV as 12.3456000. So specifying a FIXED number of
decimal places in Lotus is advisable before an IMPORT operation in dBASE
IV.
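Assuming the numeric ranges in a spreadsheet named BUDGET.WK1 have already
been formatted as Fixed with four decimal places in Lotus, the import itself
is a single command:
IMPORT FROM Budget TYPE WK1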
Generic Field Names
A database file created by the dBASE IMPORT command will have generic field
names such as A, B, ... Z, AA, AB, and so on. The first row of a spreadsheet
usually contains the titles of each column. This first row is read in as a
record instead of being used as the names of the fields in the newly
created database structure.
Should this title row in the spreadsheet be all characters (an extremely
likely scenario, since most titles incorporate letters), there's also the
negative effect of all the field types in the newly created database
structure being set to CHARACTER. Therefore, it is advisable to remove the
titles for the IMPORT.
To obtain customized field names when the database file is created, the only
alternative is to use the Lotus translate utility (TRANS.EXE) to translate
the .WK1 file to a dBASE III file format.
Missing Blank Row
Refer to Missing Blank Row under APPEND FROM..WKS below.
Blank rows will be retained as blank records if the Lotus translate utility
file (TRANS.EXE) is used to translate a WK1 file to a dBASE III file.
Using APPEND FROM..WKS
Missing First Row
The first row in the spreadsheet is always dropped during the APPEND FROM
operation. In this case, make sure that the first row is only an
unnecessary header or title. In case the first row contains data, use the
INSERT option in Lotus to insert a blank row as the top row of the
spreadsheet.
Missing Blank Row
A blank row in a spreadsheet will be disregarded by dBASE III PLUS and
dBASE IV and will not be brought in. Should it be necessary to bring in a
blank row as a record, then the blank row must not be blank. Enter a blank
space or a 0 into the last cell of the row and it will be brought in.
Missing First Field
The last cell of a row in the spreadsheet must contain data (even a
physical space). If it is empty, the first field in the next record will
be blank.
If there are many instances where the last cells in the
last column of the spreadsheet are empty, insert a dummy column as the last
column and copy some dummy data into it. This will also solve the missing
blank row problem.
For example, the database has six fields and in the spreadsheet there are
six columns. In the spreadsheet, insert a seventh column and copy some
dummy data into it.
So, when APPENDing a Lotus type .WKS file into a database, it is most
efficient and effective to use the following steps.
Insert a blank row as the top row of the spreadsheet if there is
no title or header row already there.
Insert an extra column as the last column and fill this dummy
column with some dummy data.
Don't forget to save the changes.
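With the spreadsheet prepared and saved, the append itself is brief
(ORDERS is an existing database and SALES.WKS a hypothetical spreadsheet):
USE Orders
APPEND FROM Sales TYPE WKS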
In Conclusion
As the software programs from different software publishers constantly
change, it becomes somewhat of a challenge for each company's program to
accommodate some other popular format. Proprietary rights, trade secrets
on new file formats, and technologies make things even more difficult. But
most companies make the effort. Sometimes the translation from one format
to another requires a little extra TLC, as has been shown here. But until
that day when everybody in the software business is moving at the same pace,
having the right information on how import and export operations work, with
the least amount of effort, will have to suffice.
7 Self Documenting Forms
Self Documenting Forms
Tom Woodward
Confronting the natural aversion to completing
a process by thoroughly documenting it.
The process of creating a free-standing application in dBASE IV can take a
programmer through a guided tour of different moods and emotions. The
design and implementation of a full-blown database application can be
interesting, and even, depending on your own personal interpretation, fun.
Then comes the sometimes frustrating, yet occasionally rewarding process of
debugging. You then take your finished product to your boss who gives you
the obligatory pat on the back, and then utters the words which drain the
thrill of accomplishment from your blood, and bring dread to your heart:
"Now document this thoroughly so that someone can take over if something
unforeseeable happens to you in the near future."
Documentation: the lowliest task in the development process, a necessary
evil that every programmer needs to tackle before moving on to the next
project. Wouldn't it be great if it could be done for you? If you used
the application generator to create the skeleton of your application, you
can use DOCUMENT.GEN to generate documentation automatically, but what
about all those involved screen formats and those tricky reports? Well,
by modifying FORM.GEN and REPORT.GEN, we can have self-documenting form and
report generators to go along with our existing self-documenting menu
generator.
In the interest of conserving space and time, we are only going to cover
how to create a self-documenting screen form, along with a solemn promise
from the author to tackle the report documentor in a not-too-distant
future issue.
The increased complexity of the dBASE IV format files brings up the
necessity for in-depth documentation. Validation and conditional edit
clauses, default value and calculated field expressions, edit prompt and
unaccepted value messages, as well as the screen number (for multi-screen
formats) and screen position of the field all need to be documented for
easy review.
To create the form documentor we simply need to declare the extra variable
names that our routine uses. Just insert these at the top of FORM.COD
after the existing "var" statement (it ends after the variable declaration
terminated with a ";") as follows:
//
// Documenting procedure variables
//
var
parse str, // String to parse during line formatting
temp str, // Temporary string used for parsing
test1, // Temporary variable use in line formatting
remain str, // Text left over from formatting procedure
last at, // Position where left off last line formatted
cntr1, // Counter variable
breakpt1, // Point in string where string is broken for eoln
breakpt2, // Point in string where string is broken for eoln
first time, // First time through loop flag
first, // First time through loop flag
first test; // First time through loop flag
After that's done we just insert the lines of code at the top of the body
of FORM.COD which will call the documenting procedures:
//
// Call documenting procedures
//
if !make Fmt doc() then goto nogen
doc header() // Print Header in the Doc file
fmt doc body() // Create Documentation
Note: Because the form documenting version of FORM.GEN is somewhat slower
in generating, you may want to make a copy of the normal FORM.GEN before
overwriting it with the documenting version. The non-documenting
generator could be used during development, then when all the bugs and
design issues have been resolved you can use the documenting generator.
Then, at the bottom of FORM.COD we need to add the documenting procedure
which is as follows:
//---------------------------------------------
// Documenting procedure definitions
//---------------------------------------------
//----------------------------------------------
define fmt doc body()
testing = line cnt = wnd cnt = 0
foreach ELEMENT k
//
if ELEMENT TYPE == @FLD ELEMENT then
new page(k)
if FLD FIELDTYPE == memvar then}
Variable Name:{space(indent no - 14)}{cap first(FLD FIELDNAME)}
{ else}
Field Name:{space(indent no - 11)}{cap first(FLD FIELDNAME)} \
{ if FLD FIELDTYPE == calc then}
(Calculated)
{ else}
{ endif}
{ endif}
Field Type:{space(indent no - 11)}\
{ case FLD VALUE TYPE of
case type char:}
Character
{ case type date:}
Date
{ case type float:}
Floating Point
{ case type bool:}
Logical
{ case type memo:}
Memo
{ otherwise:}
Numeric
{ endcase
if Ok Template(k) then
remain str = ""}
Picture Clause:{space(indent no - 15)}\
{wrap string(FLD TEMPLATE,width of wrap,0,remain str);}
{remain str}
{ else}
Picture Clause:{space(indent no - 15)}None
{ endif}
{ print( replicate( "-",78) + crlf);}
{ if FLD PICFUN then}
Picture Functions:{space(indent no - 18)}{get functions(k);}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD FIELDTYPE == calc then
if FLD DESCRIPT then
remain str = ""}
Description:{space(indent no - 12)}\
{wrap string(FLD DESCRIPT,width of wrap,0,"");}
{remain str}
{ endif}
Expression:{space(indent no - 11)}\
{ last at = 0
remain str = ""
foreach FLD EXPRESSION cntr1 in k}
{wrap string(FLD EXPRESSION,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ endif
if chr(FLD VALUE TYPE) == "M" && FLD MEM TYP then
if wnd cnt < 20 then ++wnd cnt endif}
Window Name:{space(indent no - 12)}Window{wnd cnt}
Window Coordinates:{space(indent no - 19)}{Doc Box(k)}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD L BOUND or FLD U BOUND then}
Valid Range:{space(indent no - 12)}\
{ if FLD L BOUND then}
From: \
{ last at = 6
remain str = ""
foreach FLD L BOUND cntr1 in k}
{wrap string(FLD L BOUND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ endif
if FLD U BOUND then
if !FLD L BOUND then}
To: \
{ else}
{space(indent no)}To: \
{ endif
last at = 6
remain str = ""
foreach FLD U BOUND cntr1 in k}
{wrap string(FLD U BOUND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD OK COND then}
Accept Value When:{space(indent no - 18)}\
{ last at = 0
remain str = ""
foreach FLD OK COND cntr1 in k}
{wrap string(FLD OK COND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ if FLD REJ MSG then
remain str = ""}
Unaccepted Message:{space(indent no - 19)}\
{wrap string(FLD REJ MSG,width of wrap,0,"");}\
{remain str}
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif // FLD OK COND
if FLD ED COND then}
Edit Value If:{space(indent no - 14)}\
{ last at = 0
remain str = ""
foreach FLD ED COND cntr1 in k}
{wrap string(FLD ED COND,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{endif
if FLD DEF VAL then}
Default Value:{space(indent no - 14)}\
{ last at = 0
remain str = ""
foreach FLD DEF VAL cntr1 in k}
{wrap string(FLD DEF VAL,width of wrap,last at,remain str);}\
{ next}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ endif
if FLD HLP MSG then
testing = 1
remain str = ""}
Help Message:{space(indent no - 13)}\
{wrap string(FLD HLP MSG,width of wrap,0,"");}
{remain str}
{ print( replicate( "-",78) + crlf);}
{ testing = 0
endif
if !FLD EDITABLE or FLD CARRY}
Attributes:{space(indent no - 11)}\
{ if !FLD EDITABLE then}
Read Only\
{ if FLD CARRY then}
, \
{ endif}
{ endif
if FLD CARRY}
Carried Forward\
{ endif}
{ print( replicate( "-",78) + crlf);}
{ endif}
Position:{space(indent no - 9)}Row: {ROW POSITN - line cnt} \
Column: {COL POSITN} Screen No.: {page cnt}
{ print( replicate( "=",78) + crlf);
print( replicate( "=",78) + crlf);}
{ endif
++cnt;
next k
return;
// eof - fmt file body()
enddef
//--------------------------------------------------------------
define get functions(c)
if c.FLD PICFUN then
first = 1
if at("Z", c.FLD PICFUN) then}
Leave blank when value is Zero
{ first = 0
endif
if at("L", c.FLD PICFUN) then
if first then}
Pad number with leading zeroes
{ first = 0
else}
{space(indent no)}Pad number with leading zeroes
{ endif
endif
if at("$", c.FLD PICFUN) then
if first then}
Display number in financial format
{ first = 0
else}
{space(indent no)}Display number in financial format
{ endif
endif
if at("^", c.FLD PICFUN) then
if first then}
Display number in exponential format
{ first = 0
else}
{space(indent no)}Display number in exponential format
{ endif
endif
if at("C", c.FLD PICFUN) then
if first then}
Follow positive credits with "CR"
{ first = 0
else}
{space(indent no)}Follow positive credits with "CR"
{ endif
endif
if at("X", c.FLD PICFUN) then
if first then}
Follow negative debits with "DB"
{ first = 0
else}
{space(indent no)}Follow negative debits with "DB"
{ endif
endif
if at("(", c.FLD PICFUN) then
if first then}
Put parentheses around negative numbers
{ first = 0
else}
{space(indent no)}Put parentheses around negative numbers
{ endif
endif
if at("T", c.FLD PICFUN) then
if first then}
Trim trailing blanks from expression
{ first = 0
else}
{space(indent no)}Trim trailing blanks from expression
{ endif
endif
if at("B", c.FLD PICFUN) then
if first then}
Left align expression
{ first = 0
else}
{space(indent no)}Left align expression
{ endif
endif
if at("I", c.FLD PICFUN) then
if first then}
Center align expression
{ first = 0
else}
{space(indent no)}Center align expression
{ endif
endif
if at("A", c.FLD PICFUN) then
if first then}
Allow input of alphabetic characters only
{ first = 0
else}
{space(indent no)}Allow input of alphabetic characters only
{ endif
endif
if at("!", c.FLD PICFUN) then
if first then}
Convert entry/expression to uppercase
{ first = 0
else}
{space(indent no)}Convert entry/expression to uppercase
{ endif
endif
if at("R", c.FLD PICFUN) then
if first then}
Literals in template are not part of data
{ first = 0
else}
{space(indent no)}Literals in template are not part of data
{ endif
endif
if at("S", c.FLD PICFUN) then
if first then}
Scroll within a display width of: {c.FLD PIC SCROLL}
{ first = 0
else}
{space(indent no)}Scroll within a display width of: {c.FLD PIC SCROLL}
{ endif
endif
if at("M", c.FLD PICFUN) then
if first then}
Selection to be made from the following choices:
{ first = 0
else}
{space(indent no)}Selection to be made from the following choices:
{ endif
last at = 0
remain str = ""
first test = 1
foreach FLD PIC CHOICE cntr1 in c
if first test then}
{space(indent no)}{wrap string(FLD PIC CHOICE,width of wrap,last at,remain str);}\
{ first test = 0
else}
{wrap string(FLD PIC CHOICE,width of wrap,last at,remain str);}\
{ endif
next}
{remain str}
{ endif
endif
enddef
//--------------------------------------------------------------
define make fmt doc()
// Attempt to create document file.
default drv = strset( defdrive) // grab default drive from dBASE
fmt name = FRAME PATH + NAME // Put path on to object name
if not fileok(fmt name) then
if !default drv then
fmt name = NAME
else
fmt name = default drv + ":" + NAME
endif
endif
fmt name = upper(fmt name)
if not create(fmt name+".DOC") then
pause(fileroot(fmt name) +".DOC" + read only + any key)
return 0;
endif
return 1;
enddef
//--------------------------------------------------------------
define doc header()
// Print Header in program
print( replicate( "=",78) + crlf);
print( replicate( "*",78) + crlf);}
*
*-- Data Description For: {filename(fmt name)}FMT
*-- Date................: {ltrim( substr( date(),1,8))}
*-- Version.............: dBASE IV, Format {FRAME VER}.1
*
{ print( replicate( "*",78) + crlf);
print( replicate( "=",78) + crlf);}
{
enddef
//--------------------------------------------------------------
define doc box(cur) // Pass in foreach cursor
// Build box coordinates for a dBASE window command
var result, temp page, line cnt;
temp page = page cnt;
// Adjust box coordinates so that negative numbers are not generated
do while ( nul2zero(cur.BOX TOP) - (scrn size * temp page) ) <= 1
temp page = temp page - 1
enddo
if page cnt == 1 then
temp page = 0
endif
if page cnt == 2 then
temp page = 1
endif
if !temp page then
line cnt = 0
else
line cnt = (scrn size * temp page) + (1 * temp page)
endif
result = nul2zero(cur.BOX TOP) - line cnt + ", "
result = result + nul2zero(cur.BOX LEFT) + " To: "
temp = nul2zero(cur.BOX TOP) + cur.BOX HEIGHT - line cnt - 1
if temp > scrn size then temp = scrn size endif
result = result + temp + ", " + (nul2zero(cur.BOX LEFT) + cur.BOX WIDTH - 1)
"From: " + result
enddef
//--------------------------------------------------------------
define wrap string(string2wrap, line width, last at, left over)
if len(left over) + len(alltrim(string2wrap)) > 237 then
parse str = left over + substr(alltrim(string2wrap), 1,
len(alltrim(string2wrap)) - len(left over))
left over = substr(alltrim(string2wrap),
len(alltrim(string2wrap)) - len(left over) + 1)
else
parse str = left over + alltrim(string2wrap)
left over = ""
endif
first time = 1
do while len(parse str) + len(left over) > line width - last at
breakpt1 = 0
temp str = parse str
do while breakpt1 + at(" ", temp str) < line width - last at && at(" ", temp str)
breakpt1 = breakpt1 + at(" ", temp str)
temp str = substr(temp str, at(" ", temp str) + 1)
enddo
breakpt2 = 0
temp str = parse str
do while breakpt2 + at(")", temp str) < line width - last at && at(")", temp str)
breakpt2 = breakpt2 + at(")", temp str)
temp str = substr(temp str, at(")", temp str) + 1)
enddo
if !breakpt1 || breakpt1 > breakpt2 then
if !breakpt1 then
if first time then}
{alltrim(substr(parse str, 1, line width - last at))}
{ first time = 0
parse str = alltrim(substr(parse str,
(line width - last at) + 1, len(parse str) - (line width - last at)))
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, line width))}
{ parse str = alltrim(substr(parse str, line width + 1,
len(parse str) - line width))
endif
else
if first time then}
{alltrim(substr(parse str, 1, breakpt1))}
{ first time = 0
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, breakpt1))}
{ endif
parse str = alltrim(substr(parse str,
breakpt1 + 1, len(parse str) - breakpt1))
endif
else
if first time then}
{alltrim(substr(parse str, 1, breakpt2))}
{ first time = 0
parse str = alltrim(substr(parse str,
breakpt2 + 1, len(parse str) - breakpt2))
last at = 0
else}
{space(indent no) + alltrim(substr(parse str, 1, breakpt2))}
{ endif
parse str = alltrim(substr(parse str, breakpt2 + 1,
len(parse str) - breakpt2))
endif
if left over then
if len(left over) + len(parse str) <= 237 then
parse str = parse str + left over
left over = ""
endif
endif
enddo
if first time then}
{alltrim(parse str)}\
{ else
breakpt1 = 0
temp str = alltrim(parse str)
do while at(" ", temp str)
breakpt1 = breakpt1 + at(" ", temp str)
temp str = substr(temp str, at(" ", temp str) + 1)
enddo
breakpt2 = 0
temp str = alltrim(parse str)
do while at(")", temp str)
breakpt2 = breakpt2 + at(")", temp str)
temp str = substr(temp str, at(")", temp str) + 1)
enddo
if breakpt1 >= breakpt2 then
if !breakpt1 then
last at = 0
remain str = parse str
test1 = ""}
{space(indent no)}\
{ else}
{space(indent no) + substr(parse str, 1, breakpt1)}\
{ if len(parse str) > breakpt1 then
remain str = substr(parse str, breakpt1 + 1)
endif
last at = breakpt1
endif
else}
{space(indent no) + substr(parse str, 1, breakpt2)}\
{ if len(parse str) > breakpt2 then
remain str = substr(parse str, breakpt2 + 1)
endif
last at = breakpt2
endif
endif
enddef
// EOF: Form.cod
}
To see what this auto-documentor can do, let's assume you've created a
screen for the file "Orders" which has the following fields:
Part_Name, Part_ID, Date_Order, Discontinu, Cost, Qty_2Order, Qty_OnHand,
Comments, Terms, Discount.
After compiling the documenting .COD file and making it the default
template for form generation, resaving the form will create a file with the
same name as your screen form file and an extension of .DOC. A sampling of
this printout is shown below.
==============================================================================
******************************************************************************
*
* Data Description For: ORDERS.FMT
* Date................: 3-31-91
* Version.............: dBASE IV, Format 1.1
*
******************************************************************************
==============================================================================
Field Name: Part name
Field Type: Character
Picture Clause: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Help Message: Enter the Part's Name
Position: Row: 2 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Part id
Field Type: Character
Picture Clause: 99-9999-9999
Picture Functions: Literals in template are not part of data
Help Message: Enter the Part Number
Position: Row: 2 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Date order
Field Type: Date
Picture Clause: None
Help Message: Enter the Date of Order
Attributes: Carried Forward
Position: Row: 3 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Discontinu
Field Type: Logical
Picture Clause: Y
Position: Row: 4 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Cost
Field Type: Numeric
Picture Clause: 999999.99
Picture Functions: Display number in financial format
Help Message: Enter the Cost of this part
Position: Row: 6 Column: 12 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Qty 2order
Field Type: Numeric
Picture Clause: 9999
Edit Value If: .NOT. DISCONTINU
Help Message: Enter the quantity that is to be ordered for
this part
Position: Row: 6 Column: 39 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Qty onhand
Field Type: Numeric
Picture Clause: 9999
Help Message: Enter the Quantity that we have on hand for
this part
Position: Row: 6 Column: 58 Screen No.: 1
==============================================================================
==============================================================================
Field Name: Discount
Field Type: Numeric
Picture Clause: 99
Accept Value When: DISCOUNT = IIF( Lead Time > 120, 0, IIF(
Lead Time <= 120 .AND. Lead Time > 90, 5,
IIF(Lead Time <= 90 .AND. Lead Time > 60,
10, IIF( Lead Time <= 60 .AND. Lead Time >
30, 15, 20 ))))
Unaccepted Message: Lead Time
>120:Dis=0;<=120,>90:Dis=5;<=90,>60:Dis=10;<=6
0,>30:Dis=15;<=30:Dis=20
Help Message: Enter the Percentage Discount we will
realize depending on lead time.
Position: Row: 11 Column: 66 Screen No.: 2
==============================================================================
==============================================================================