This session gets you up to speed quickly. What can you do with a Remote View? How do you set one up? What tricks can save you time and frustration? How do you handle update conflicts? How can you get more power than is apparent through the View Designer? Eldor Gemst shows you all of this and more in this intermediate-level session.
It is often said that VFP makes an excellent front-end for Client/Server (C/S) systems. VFP offers us two mechanisms for accessing C/S data – Remote Views and SQL Pass-Through. Once you have set it up, a Remote View behaves and is used basically just like a Local View – very powerful and easy to use. But what if the remote system has some commands or features that VFP doesn't understand? That's where SQL Pass-Through comes in. With Pass-Through, a VFP programmer can send commands to the back-end that are only understood by the back-end. This means it is unlikely that you'll ever find a Client/Server situation where VFP would be a poor front-end choice. This session focuses on Remote Views, but we will incorporate some SQL Pass-Through code.
Views, in general, can be thought of as virtual tables. Once they are created, they are used just like tables (you even open them with the USE command). Unlike a query, there is no QPR file to be executed before the data is obtained. Local Views are used with local Visual FoxPro tables, and Remote Views with any other data; all that is required is the ability to set up an ODBC connection to the data. Views can also be "parameterized", meaning they can easily be set up so that they only return data matching your criteria, such as all invoices for the current client.
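As a sketch of the idea (the table, view, and parameter names here are invented for illustration, not part of the sample database), a parameterized local view might be defined and used like this:

```foxpro
* Hypothetical parameterized view: ?pcClientId is filled in
* at USE/REQUERY time from a variable of the same name.
OPEN DATABASE mydata
CREATE SQL VIEW "LV_CLIENTINVOICES" AS ;
   SELECT * FROM invoices WHERE invoices.client_id = ?pcClientId

* Using it: set the parameter, then open the view like a table.
pcClientId = "C1001"
USE LV_CLIENTINVOICES
BROWSE
```

If the parameter variable is not in scope when the view is opened, VFP prompts the user for a value, which is handy for quick testing.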
Remote Views can be used in various situations. The most common is to access Client/Server data on a foreign system, such as SQL Server, Sybase, Oracle, etc. Another use might be to access FoxPro data where an existing FoxPro 2.x system is currently in use and a new Visual FoxPro system is being developed. The problem is that once you bring the 2.x tables into Visual FoxPro, they will no longer be accessible from FoxPro 2.x, so the old system will no longer be able to use the data. You could of course duplicate the data, with one set used by the old 2.x system and the other by Visual FoxPro. In the real world this seldom works: once you have duplicate data, besides the wasted disk space, you run a real risk of damaging data integrity. Will the users always remember to make changes in both systems? Unlikely!
The solution here might be to create the new VFP system using Views of the 2.x tables, which would allow for one set of data to be used in the system instead of two. An added advantage is that if your application is constructed to use Views of the data, scalability becomes far simpler. If the system grows significantly, a “back-end” server can be much more easily substituted for handling the data, without having to make major rewrites of the application.
Producing sample data for this session was a bit of a challenge, especially since there is a strong chance many of you might not actually have SQL Server available on your machines. And even if you did, SQL Server data isn’t as simple to move from machine to machine by floppy as VFP data. So I had to come up with samples which could be used by most of you.
For those of you with SQL Server available, I elected to include a series of SQL scripts (included with the source code for this session) which could be run to produce an SQL Server database and all the tables we need for this session. For those of you without SQL Server, I have provided a VFP database and tables with similar (as similar as possible) referential integrity rules, field-level validation rules and default values for primary key generation. You need to set up an ODBC connection to this database and then most of the sample code will work properly. I’ll endeavor to point out the differences as we come to them.
Please see the end of this article for details on how to set things up.
The first requirement, and it is of absolute importance, is a familiarity with the back-end engine and data. You’re dreaming if you think you can successfully create a Client/Server system without adequate knowledge of the back end. You need to know data structures, primary and foreign keys, “constraints” (which are a type of rule), triggers, procedures stored on the back-end, and any special characteristics of the back-end that might affect your work.
In addition, development of a Client/Server system should be a partnership between the front and back-end developers. It must be decided where certain operations should be performed, based on the strengths/weaknesses of the systems involved as well as expected users and traffic on the system. See the end of these session notes for some suggested reading on the topic.
The way we access remote data (whether with a Remote View or SQL Pass-Through) is by way of an ODBC connection. So the first step in our process does not even require FoxPro.
Go to Windows Control Panel and click on the 32-bit ODBC Administrator. This is where you establish what data will be available to your applications (whether Visual FoxPro, or even Word or Excel). Every database you want to connect to will need an ODBC data source to be defined. Look first at the ODBC Drivers page to make sure you have a driver for the database you want to connect to.
If you are connecting to a SQL Server database and you have installed Visual Studio 6.0 Enterprise (probably the non-Enterprise version also), you will end up with SQL Server driver version 3.60.0319. (Unless you also install the beta of SQL Server 7.0, in which case you will have a newer driver.) There seems to be a problem with this driver on some installations: everything works, but picking tables for the Remote View in VFP results in a delay of over 30 seconds between picking the table and its actually appearing in the View Designer. The same delay occurs when saving your View or next opening it in the Designer. Other than the slowness of these operations, it seems to work all right.

I had this problem on two of my machines, while the third machine appeared to have the same version of the driver but was working perfectly. Lots of frustration and many CompuServe and Internet messages later, I finally gave up, reformatted the drive, and installed a fresh copy of Windows NT and all my software. <sigh> It cured the problem, but it reminds me of the old joke where the doctor says the operation was a success but the patient died. I'm still trying to narrow down what causes this problem, and if you experience it, I would appreciate an email message telling me about it. I suspect it might have to do with having had beta versions of Visual Studio installed on the machine prior to the shipping version, but cannot confirm this. Maybe this is one of the reasons why, as some said, the only safe way to uninstall the beta was to reformat. I'm still using that 3.60.0319 version of the driver. Another funny thing about that driver is that the folks on the SQL Server forum of CompuServe are convinced by the version number that it is a beta driver and shouldn't be available anywhere. But it is indeed the version that gets installed with Visual Studio 6.0.
You can see (above) that we have created a System DSN connection to our Friends test database. One of the first decisions we need to make is what type of DSN to choose. (DSN means Data Source Name.) What are the differences? Here is an overview:
|DSN (Data Source Name) Types and Uses|
|User DSN|Local to the computer and can only be used by the current user|
|System DSN|Local to the computer but can be used by any user with privileges, or by the system itself|
|File DSN|A file-based data source that can be shared by all users who have the same drivers installed|
Which one you choose depends on which machine you are on, where the data resides, and which other users will need access to this data. Note that an ODBC DSN needs to be created on each machine that will access the data.
Once you are back in Visual FoxPro and you want to create a Remote View, you need to decide whether to use the DSN you created directly, or to use a “Connection” to that DSN. Several issues are involved here, among them performance and licensing requirements (some servers are licensed by connections). One big advantage to using a Connection from VFP instead of a data source is if ever in the future you need to change the back-end database – you only have to redefine the Connection properties. This is much less work. I suggest you use Connections.
So in the database container (DBC) the first thing you should do is create a Connection. Right-mouse click on the DBC background and choose “Connections”. If you’re creating a new one here, click on New. Here is the Connection we are using for our example:
For performance reasons, we have chosen an “Asynchronous” connection rather than “Synchronous”. This means that control will return to the application right away – before the query has finished running on the back-end. This gives the perception of much greater speed. With a Synchronous connection, control is not returned to the application until all the result set has been returned.
When you save your Connection, it is a good idea to respect naming conventions. We have named our Connection “conFriends” to tell us it is a connection to the Friends database.
The biggest performance boost with Remote Views is obtained simply by returning small data sets. Do not bring down data unnecessarily. This may involve a paradigm shift for the developer (and maybe for the user). With local VFP data, we do not think anything of letting the user see all our data. With C/S data, this is simply not done. Data sets could be enormous and in this case it normally doesn’t even make sense to the user to see all the data. Therefore the developer really should ask the user to refine the search criteria for the returned records. This is normally done with the use of Parameterized Views. The best place to see some of the important settings for Remote Views is through the Tools|Options|Remote Data dialog:
These settings can be controlled on a view-by-view basis, but with this dialog you can more clearly see the important ones. Turn on Share Connection, as it is very easy to run out of connections if you’re not careful. The C/S administrator can also configure a higher number of permitted connections (as long as the licensing of the server product allows this) but it is a good idea to be stingy with the creation of unnecessary connections.
Another setting that could be important is Fetch Memo. You may wish to turn this OFF (it is ON in our graphic) so that memo data (which could be very large) is not brought down from the server without need. It should only be brought down when the user needs to see it; this could greatly improve performance and reduce system traffic. Still on the topic of memos, you might also wish not to compare memo data when saving edited data (depending on your system's requirements), as it might be very unlikely that two or more users have all edited the same memo data. Not comparing the memo data can give a significant performance boost.
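These same settings can also be made on a per-view basis with DBSETPROP(). For example, for the RV_CARDS view from our sample database:

```foxpro
* Don't fetch memo contents until the user actually needs them,
* and leave memo fields out of the update comparison.
DBSetProp('RV_CARDS', 'View', 'FetchMemo',   .F.)
DBSetProp('RV_CARDS', 'View', 'CompareMemo', .F.)
```

With FetchMemo off, the memo contents are fetched on demand when the field is accessed, rather than with the rest of the result set.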
The Fetch Size (records to fetch at a time) only works with an Asynchronous connection. It allows processing to continue on the client machine as soon as the number of records set here has been returned, while the remaining records are obtained in the background. This really gives a perception of speed – and it's really only the perception that counts; if a user thinks it's fast, it is fast. (Remember to SQLCANCEL() before calling a subsequent query, because the first one might still be running.)
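With SQL Pass-Through, the same asynchronous pattern looks roughly like the following sketch. The connection name matches our sample; the query itself is only illustrative.

```foxpro
* Asynchronous SQL Pass-Through sketch.
lnHandle = SQLCONNECT("conFriends")
= SQLSETPROP(lnHandle, "Asynchronous", .T.)

* In asynchronous mode SQLEXEC() returns 0 while the query is
* still executing; keep re-calling it with just the handle until
* it returns 1 (done) or a negative value (error).
lnResult = SQLEXEC(lnHandle, "SELECT * FROM tblFriends", "crsFriends")
DO WHILE lnResult = 0
   * ...the application can do other work here...
   lnResult = SQLEXEC(lnHandle)
ENDDO

* Before issuing another query on this handle, make sure the
* previous one is not still running.
= SQLCANCEL(lnHandle)
= SQLDISCONNECT(lnHandle)
```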
If the C/S table uses bit data types (the closest data type to VFP's logical fields) and you are adding new records in your VFP Remote View, you must make sure your view supplies a default value for those fields (unless the C/S database supplies default values for them); otherwise adding the record might fail. This again underscores the importance of knowing the back-end very well and of joint development efforts between the front and back-end developers. You need to know all the details of the C/S data: what validations there are, what constraints, defaults, triggers, etc.
Views in VFP do not have the ability to have Triggers attached, but we do have the ability to add record-level validation. You just can’t get to this ability through the View Designer. Use DBSETPROP() to set the RuleExpression property, etc. See further in these notes where GENDBC is used to see some of the settings you could change programmatically, and also see the Help File for DBGETPROP() which lists all the available properties.
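As a sketch (the rule expression and message here are invented for illustration), record-level validation could be added to our RV_CARDS view like this:

```foxpro
* Hypothetical record-level rule: a card must belong to a friend.
DBSetProp('RV_CARDS', 'View', 'RuleExpression', 'friend_id > 0')
DBSetProp('RV_CARDS', 'View', 'RuleText', ;
   '"Each card must be assigned to a valid friend"')
```

The rule fires when the buffered record is written in VFP, so bad data can be caught before it is ever sent to the back-end.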
Often, when accessing a remote system, there is primary key generation in effect on the back end server. Therefore when a new record is added in the Remote View in VFP, the best solution is to leave that primary key field blank so that when the record is sent to the back-end, the back-end can supply a primary key value. If we attempt to add a value in our View, we run the risk that the value we pick won’t be accepted by the C/S system (maybe it’s already in use) or that another user on the system might be adding the same value. If you really do need to supply a value in your VFP application, careful thought must be given to developing a key generation system which won’t conflict with other users. Maybe each workstation will have its own range of available numbers, or another number (or letters) are added to the primary key value denoting the workstation/user who added the record. This ensures that if two users are adding a key value of “1000” at the same time, one might be “A1000” and the other “B1000” or something along those lines.
Another approach would be to have a “seed” table located on the server, and a stored procedure that we would call from any workstation which needs a new primary key value for a table. This is the same technique we often use with local databases where we have a NextID table and a function to get us that value. The code shows a sample of how this can be done.
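A minimal sketch of the seed-table approach follows. The stored procedure name, seed table, and column names are invented for illustration; they are not part of the sample database. The assumption is that a back-end procedure atomically increments the seed value for the requested table and returns the new value.

```foxpro
* Assumed back-end objects: a seed table NextID(table_name, next_val)
* and a stored procedure sp_GetNextID that increments and returns
* the value for the named table.
lnHandle = SQLCONNECT("conFriends")
lnOK = SQLEXEC(lnHandle, ;
   "EXECUTE sp_GetNextID 'tblFriends'", "crsNextId")
IF lnOK = 1
   lnNewKey = crsNextId.next_val   && use this as the new primary key
ENDIF
= SQLDISCONNECT(lnHandle)
```

Because the increment happens in one place on the server, two workstations asking for a key at the same time cannot receive the same value.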
Once you have your connection created, it's time to make the Remote View. Right-click on the background of the DBC and choose New Remote View. If you have "Prompt for Wizards" turned on, don't choose the Wizard. (My personal feeling is that some wizards are great and others are, well, sort of "brain dead". Let's just say that the View Wizard isn't one of the better wizards.) Choose "New View" and you will be prompted to pick a connection (which is the default). The connection you created earlier should be listed; select it and click OK. If you decided to use a data source instead of a connection, click the appropriate option button at the bottom of the dialog.
The next step will either be a Login dialog (if you did not enter your Login ID and Password in the Connection itself) or the Open dialog which lets you choose the table(s) you want from the database. The real trick with using Views is in setting the Update Criteria. Here is the Update Criteria Page:
If you access a C/S system such as SQL Server (as our example does) the View Designer will recognize any primary keys from the remote table and all the settings will already have been made (except one). The middle section has a column with a key in the header, and that denotes the Primary Key. You can change it to something else if you need to. The second column (with the pencil) tells which columns should be updateable. Note that the Primary Key is by default not updateable. Generally this is considered the best approach as it’s important not to change a Primary Key for an existing record while other users might perhaps be editing the same record. When they try to send their updates, the original record would not be found and the update would fail. If you have a good reason to allow updating the Primary Key, then simply click in that column to make it updateable.
The most confusing part of this dialog is the upper right section: "SQL WHERE clause includes". I like to explain this section as being the criteria under which an update should fail. Look at it this way: if the Key field was changed by someone else while you were editing a record, should your update be allowed? What would you update? How would you find the original record? It's pretty obvious that in this case the update should fail. That is why all the options in that section include the Key field.

The default setting, "Key and modified fields", is really quite exciting, as it has fabulous conflict resolution built in. This option means that if you have edited the credit limit and another user changed the telephone number (and saved it) while you were editing, your edit of the credit limit would be allowed, and you would not overwrite the other user's change of the telephone number. Pretty cool, isn't it?

The "Key and updateable fields" option means that if another user changed any of the fields you identified in the View as being updateable – even fields you didn't change while editing – the update will fail.

The final option is "Key and timestamp". Most C/S systems have a timestamp field (which really isn't a timestamp so much as a unique value) which can be added to tables. That timestamp field is automatically changed whenever the record is written to. This is used in the case where our View does not have all the fields (perhaps for security) but we want to make sure that if anything changed in the remote data, even in fields not included in our View, the update should fail.
The final setting in this dialog, and the one which was not set for us by default, is the most important. In the bottom left corner you will see a checkbox for “Send SQL updates”. Until this is turned on, it doesn’t matter what other settings are made for the view… Updates will not be sent.
Our back-end table (tblFriends in the SQL Server database Friends) has automatic Primary Key generation by way of an “Identity” property on the Primary Key field. (This is the Friend_id field.) If we accept the Update Criteria defaults in our View definition, the Primary Key will not be updateable. (This is probably the best approach anyway.) We would normally not show our user this Primary Key and when a record is added to the View, it is sent to the back-end without a value in this field. The back-end then supplies a valid Primary Key value.
If we choose to supply our own Primary Key values, we cannot do this with a column marked as an Identity column, unless on the back-end we enable this capability. With SQL Server it is done by setting the IDENTITY_INSERT property to ON.
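Via SQL Pass-Through, that might look like the following sketch. tblFriends and Friend_id come from our sample database; the other column and the key value are invented for illustration.

```foxpro
* Temporarily allow explicit values in the identity column,
* insert our own key, then turn the setting back off.
lnHandle = SQLCONNECT("conFriends")
= SQLEXEC(lnHandle, "SET IDENTITY_INSERT tblFriends ON")
= SQLEXEC(lnHandle, ;
   "INSERT INTO tblFriends (Friend_id, last_name) " + ;
   "VALUES (5000, 'Smith')")
= SQLEXEC(lnHandle, "SET IDENTITY_INSERT tblFriends OFF")
= SQLDISCONNECT(lnHandle)
```

Note that SQL Server allows IDENTITY_INSERT to be ON for only one table at a time per session, so it should be switched off as soon as the insert is done.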
Here our back-end table has a field-level validation rule, and we have two options for handling it. We can either send our data to the back-end to have the validation handled there (and then deal with the error if it fails), or we can do our own validation in VFP and only once that has passed would we send the data back. Of course this means the data gets validated twice instead of once, but I'd rather do that extra checking than send data down the wire which might not pass validation. When I send my updates, I know the data will be accepted. In the next session, "How to Use Remote Views with Forms", I'll show you how to do this.
In this example, our back-end table has trigger code for Referential Integrity. Specifically we will try to add a Card record for a Friend who doesn’t yet exist in the Friends table. Our Insert Trigger code in the C/S system needs to return an error condition (with SQL Server, we use the RAISERROR command) otherwise our attempt to add the record will simply fail without any kind of notification. See the EG-TRG.SQL script supplied with the source code for this session to see how our insert trigger for tblCards was constructed. The trigger is called “trgtblCards_ins”.
In order to handle errors, we first need to be told about the error. This is normally done on the C/S system where we (with SQL Server) use the RAISERROR command to trigger an error condition. With RAISERROR, the error will be logged in the C/S system and sent back to the front-end application so that we can trap for it in VFP. Error numbers above 50000 are reserved for use by our applications. Here is an example:
RAISERROR 60000 'Trigger trgtblCards_ins on table tblCards:
Card for non-existing Friend'
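On the VFP side, a sketch of trapping that error after a failed save might look like this (using the RV_CARDS view from our sample database):

```foxpro
* After TABLEUPDATE() fails on the remote view, AERROR() tells us
* what the back-end complained about.
IF !TABLEUPDATE(.T., .F., "RV_CARDS")
   LOCAL laError[1]
   = AERROR(laError)
   * For ODBC errors laError[1] is 1526 (connectivity error);
   * the back-end message, including our RAISERROR text, is in
   * laError[3].
   MESSAGEBOX("Update failed: " + laError[3])
   = TABLEREVERT(.T., "RV_CARDS")
ENDIF
```

Whether you revert, retry, or let the user correct the data is of course an application-level decision; the revert here is just one possibility.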
Sooner or later you will run into a situation where you know the View Designer will not have enough power to meet your needs. What to do then? Luckily for us, we don’t have to resort to writing the entire View definition ourselves by hand. In one of the sub-directories beneath the VFP installed directory you will find a program called GENDBC.PRG. This is a very valuable gem for us, and many developers are unaware it exists.
One of the main uses of GENDBC is to produce a backup, in code, of our entire DBC with all the rules, relationships, triggers, etc. Once I have finished my design I always run GENDBC to make a backup of everything. I’ve even sometimes used it at a client site to create all the tables, indexes, primary keys, relationships, etc. Think of it as an insurance policy.
The other reason why I’m recommending GENDBC to you is for those times when you need to extend the power of Views beyond what can be done with the View Designer. You can run GENDBC, find the code that was created, copy it out to another file and edit that file to get beyond whatever limitation you had run into. I then normally delete the original view from the DBC and simply run my new code to recreate the view, putting it back into the DBC automatically. The only thing I give up (depending on what I’ve changed by hand) when doing this is the ability to make further modifications to the View definition with the Designer.
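Running GENDBC is a one-liner. Assuming a default VFP 6.0 installation (adjust the path if yours differs):

```foxpro
* Generate a program that recreates the open DBC, including all
* views, connections, rules and relations, into FRIENDS.PRG.
OPEN DATABASE friends
DO (HOME() + "TOOLS\GENDBC\GENDBC.PRG") WITH "FRIENDS.PRG"
```

The generated program can then be edited, and simply running it recreates the DBC contents.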
Here is the View definition code for one of the simple Remote Views we use in this session (with many of the fields left out to save space – if you need to see the entire code, run GENDBC yourself on the sample DBC provided with the session code):
***************** View setup for RV_CARDS ***************
CREATE SQL VIEW "RV_CARDS" ;
REMOTE CONNECT "conFriends" ;
AS SELECT * FROM dbo.tblCards Tblcards
DBSetProp('RV_CARDS', 'View', 'UpdateType', 1)
DBSetProp('RV_CARDS', 'View', 'WhereType', 3)
DBSetProp('RV_CARDS', 'View', 'FetchMemo', .T.)
DBSetProp('RV_CARDS', 'View', 'SendUpdates', .T.)
DBSetProp('RV_CARDS', 'View', 'UseMemoSize', 255)
DBSetProp('RV_CARDS', 'View', 'FetchSize', 100)
DBSetProp('RV_CARDS', 'View', 'MaxRecords', -1)
DBSetProp('RV_CARDS', 'View', 'Tables', 'dbo.tblCards')
DBSetProp('RV_CARDS', 'View', 'Prepared', .F.)
DBSetProp('RV_CARDS', 'View', 'CompareMemo', .T.)
DBSetProp('RV_CARDS', 'View', 'FetchAsNeeded', .F.)
DBSetProp('RV_CARDS', 'View', 'FetchSize', 100)
DBSetProp('RV_CARDS', 'View', 'Comment', "")
DBSetProp('RV_CARDS', 'View', 'BatchUpdateCount', 1)
DBSetProp('RV_CARDS', 'View', 'ShareConnection', .F.)
*!* Field Level Properties for RV_CARDS
* Props for the RV_CARDS.card_id field.
DBSetProp('RV_CARDS.card_id', 'Field', 'KeyField', .T.)
DBSetProp('RV_CARDS.card_id', 'Field', 'Updatable', .F.)
DBSetProp('RV_CARDS.card_id', 'Field', 'UpdateName', 'dbo.tblCards.Card_id')
DBSetProp('RV_CARDS.card_id', 'Field', 'DataType', "I")
* Props for the RV_CARDS.friend_id field.
DBSetProp('RV_CARDS.friend_id', 'Field', 'KeyField', .F.)
DBSetProp('RV_CARDS.friend_id', 'Field', 'Updatable', .T.)
DBSetProp('RV_CARDS.friend_id', 'Field', 'UpdateName', 'dbo.tblCards.Friend_id')
DBSetProp('RV_CARDS.friend_id', 'Field', 'DataType', "I")
* Props for the RV_CARDS.occasion field.
DBSetProp('RV_CARDS.occasion', 'Field', 'KeyField', .F.)
DBSetProp('RV_CARDS.occasion', 'Field', 'Updatable', .T.)
DBSetProp('RV_CARDS.occasion', 'Field', 'UpdateName', 'dbo.tblCards.occasion')
As you can see, there is much more to the View definition than just the simple SQL SELECT statement. Besides the actual SQL SELECT statement, you might be interested in changing many other properties (as mentioned earlier, see the VFP Help File for the topic DBGETPROP() for a list of all available properties).
There is also code for establishing the Connection settings:
***************** Connection Definitions CONFRIENDS ***************
CREATE CONNECTION CONFRIENDS ;
DATASOURCE "Friends" ;
USERID "Eldor" ;
=DBSetProp('CONFRIENDS', 'Connection', 'Asynchronous', .T.)
=DBSetProp('CONFRIENDS', 'Connection', 'BatchMode', .T.)
=DBSetProp('CONFRIENDS', 'Connection', 'Comment', '')
=DBSetProp('CONFRIENDS', 'Connection', 'DispLogin', 1)
=DBSetProp('CONFRIENDS', 'Connection', 'ConnectTimeOut', 15)
=DBSetProp('CONFRIENDS', 'Connection', 'DispWarnings', .F.)
=DBSetProp('CONFRIENDS', 'Connection', 'IdleTimeOut', 0)
=DBSetProp('CONFRIENDS', 'Connection', 'QueryTimeOut', 0)
=DBSetProp('CONFRIENDS', 'Connection', 'Transactions', 1)
=DBSetProp('CONFRIENDS', 'Connection', 'Database', '')
Please run the supplied scripts in the following order to create the files/data needed for the SQL Server examples.
|Supplied SQL Script Files|
Creates data and log devices and database
Creates tables and indexes
Creates test data for all tables
If you don't have SQL Server available to test with the sample data, we have provided a VFP database and tables that match as closely as possible what we created in SQL Server. Copy the files to your hard drive and then, using the ODBC Manager, establish a System DSN to this database. From there on, you should be able to follow our examples using this data instead of real C/S data. See the README.TXT file with the source code for further details.
Two of the better books on the topic of Client/Server development are Robert Green's Developing Client/Server book (now part of the Microsoft Press book "The Pros Talk Visual FoxPro 3", ISBN: 1-57231-233-5) and Stamati Crook's "Red". You can get information on Red at http://www.redware.com. One of the best books on SQL Server is "Professional SQL Server 6.5 Admin" from Wrox Press (ISBN: 1-874416-49-4).
These sessions are really a “work in progress” and you might want to check on my web site periodically to get the latest version of this article and source code. My web site is at http://www.elgem.com.