Design Standards Survey

Several variables play a critical role in determining the cost of designing a customized database.  To help predict the time and expense of designing a database for your organization, please consider the following questions and indicate your firm’s preferences.

Please feel free to call and discuss these issues further if it would help you with your answers.  Your answers to these questions are an important factor in our time and cost estimates.

Debugging

Software design of all types follows the ancient academic/scientific practice of advancing a theory, testing the theory, correcting flaws revealed by the tests, and then repeating that cycle until no flaws are revealed or until the flaws are deemed inconsequential.

In modern-day parlance, this is called debugging.

You can save money by doing more of the testing yourself.  For some businesses this is acceptable; for others, it isn’t.  Doing more of the testing yourself also means your database will be delivered sooner, because achieving a very high confidence level requires additional testing time.  Sometimes a flawed database is so much better than the status quo that an organization will choose less debugging in the interest of faster delivery.

We would never deliver software with known problems, but the amount of effort spent finding problems can be adjusted to the user’s needs. With these factors in mind, please indicate the confidence level you desire in your customized database:

Lower Cost                  Higher Cost
More Problems               Fewer Problems
Faster Delivery             Slower Delivery

50%    60%    70%    80%    90%    100%

User Interface

How easy should it be to learn to use your database?
At one end of the spectrum, imagine a database placed in a front office for anyone off the street to sit down and search the company’s archives for information. After less than five minutes of instruction from the receptionist, an inexperienced user could click their way through the database, performing searches and sorts, browsing multiple screens of information in each record, and making printouts to take home. Every step is presented in an easy-to-follow manner, with tutorial screens and with all possible blind alleys and detours carefully blocked so the user cannot get lost.

At the other extreme, imagine a basic database structure with the files, fields and some rudimentary screens used for viewing and printing. You learn FileMaker Pro from the manuals, from a colleague, or from a training class. You can then design and modify screens and reports, use FileMaker’s built-in find and sort procedures and otherwise navigate through the database manually.

In between these two extremes is probably the right point for your organization.  A simple-to-use interface requires more time to design, but you will reap long-term savings with a shorter training curve for your users and less possibility of user mistakes.  On the flip side, changes to the database are more likely to require the assistance of a qualified consultant.

How quickly, as measured in hours of training, should someone be able to learn how to effectively use the database for routine functions?

Lower Design Cost            Higher Design Cost
More Training                Less Training
More Pitfalls                Fewer Pitfalls
Easier For You To Change     Harder For You To Change

8 hours    6 hours    4 hours    2 hours    0 hours

Data Conversion

(Please skip this question if you don’t currently have data in digital form that needs to be imported into the new database.)

The foundation of a database is the file and field structure that determines how the data is organized. If you already have data in digital form, whether in a FileMaker database or another format, its file and field structure differs from what it will be in the new database.  Otherwise, you wouldn’t need a new database.

The process of moving the data from the old format to the new format is called data conversion.

The bulk of data conversion is performed in large blocks. A typical example: a single field in the old database that contained both the city and the state might be split into two fields, one for city and another for state, in the new database.  As long as large groups of records fit a given pattern, the computer is an effective tool for data conversion.  A skilled database professional is also invaluable for extracting data from old files.
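To make the city/state example concrete, here is a minimal sketch of that kind of pattern-based conversion. (In practice this work would be done with FileMaker calculations or scripts; the Python below just illustrates the logic. The field names and the comma-delimited format are assumptions for illustration, not your actual data.)

```python
def split_city_state(record):
    """Split a combined 'city_state' value like 'Portland, OR' into two fields."""
    value = record.get("city_state", "")
    if "," in value:
        # Fits the expected pattern: split on the last comma.
        city, state = value.rsplit(",", 1)
        record["city"] = city.strip()
        record["state"] = state.strip()
    else:
        # Problem record that resists the pattern; keep the raw value
        # in the city field and leave the state blank for manual review.
        record["city"] = value.strip()
        record["state"] = ""
    return record

old_records = [
    {"city_state": "Portland, OR"},
    {"city_state": "San Jose, CA"},
    {"city_state": "Springfield"},   # problem record: no state given
]
new_records = [split_city_state(dict(r)) for r in old_records]
```

Records that fit the pattern convert automatically; the rest surface as the "problem records" discussed below.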

At some point, however, the law of diminishing returns usually kicks in.  A few problem records or fields will resist any logical approach, either because of flaws in the structure of the old database or because of the way data was entered.  At that point, you might decide that the data isn’t worth the effort, or decide that it’s so important it should be entered by hand into the new database.

Predicting the nuances and details of data conversion is often the most difficult part of forecasting a project’s cost.  Recognizing that neither of us can foresee with absolute certainty what sort of problems are likely to emerge in the data conversion process and what pieces of data are likely to become problems, please estimate what percentage of your old data is likely to be important to you.

Lower Cost   Higher Cost
75% 80% 85% 90% 95% 100%

Access To Data During Development

Some organizations require constant access to their data even while the design is being finalized and the data conversion is under way.

There are trade-offs in development efficiency and in monitoring of project progress, with pros and cons both for doing database development on site and for taking the development to the offices of the developer.

In the most extreme case, an organization can’t stand to be without access for a single moment.  This requires more planning in advance of the data conversion, more debugging before installation, and some work occurring only at night or on weekends.

In this context, assuming that periods of inaccessibility are scheduled in advance, how long can your organization be locked out of the databases without undue hardship in the interest of keeping costs lower?
____ hours / days   (circle one)

Features vs. Speed

On a given computer or network, a database with fewer features will run faster than a database with more features.  A computer will process a certain number of instructions in a given amount of time (this is what “megahertz” is all about when comparing hardware). If some of that processing time is used for extra features, it will perform its core functions more slowly.

For example, the computer might automatically generate a salutation for a letter based on someone’s name as opposed to having the user enter a salutation. This sounds fine, but if the database is rarely used to generate personal letters, it’s a waste of resources.
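As a sketch of what such an automatic-salutation feature might do (again, FileMaker would implement this as a calculation field or script; the name format and title handling below are assumptions for illustration):

```python
def salutation(full_name, title=""):
    """Build a letter salutation such as 'Dear Ms. Jones:' from a name.

    Assumes names are written 'First Last'; falls back to the full name
    when no title is supplied, and to a generic greeting when no name is.
    """
    parts = full_name.split()
    last_name = parts[-1] if parts else ""
    if title and last_name:
        return f"Dear {title} {last_name}:"
    if full_name:
        return f"Dear {full_name}:"
    return "Dear Sir or Madam:"

print(salutation("Pat Jones", "Ms."))   # Dear Ms. Jones:
print(salutation("Pat Jones"))          # Dear Pat Jones:
```

Even this small feature costs processing time and design effort on every record, which is exactly the trade-off this question asks you to weigh.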

In this light, on an arbitrary scale of 1 to 5, what is your bias on the question of features vs. speed?

More speed; fewer features   More features; less speed
1 2 3 4 5

Priorities

A project of this nature involves a multitude of trade-offs.  By now, you’ve dealt with most of the either/or situations, yet there are many other factors by which you might judge the success of a project. Listed below are some of the criteria that might be important to you, along with blanks for you to enter other criteria of your own. Please rank these criteria on a 1-2-3 basis, where 1 is your top priority.  Enter a 0 for items that aren’t important at all.

Rank
___ Low cost
___ Accuracy in handling data
___ Fast delivery
___ Staff satisfaction
___ Features
___ Speed of database in use
___ Complete data conversion
___ Short training curve
___ Visual appeal of layouts and reports
___ Avoidance of user errors
___ Minimal disruption to office operations
___ _______________________
___ _______________________
___ _______________________